
Balance Between User Base and Community in OpenStack and OpenNebula

Market Share versus Investment

This post is a reprint of a post at OpenNebula.org

In our last post, "OpenNebula vs. OpenStack: User Needs vs. Vendor Driven", we stated that "OpenStack penetration in the market is relatively small compared with the investment made by vendors and VCs". We have received several emails asking for the numbers that support this statement. The conclusion arises from a comparison of the OpenNebula and OpenStack user bases, as well as of the resources each project invests in development and marketing.

User Base
OpenStack is experiencing explosive growth in the number of developers, with more than 200 companies contributing code, 15,000 people and 850 companies involved according to its web site, and almost 1,000 developers involved in its latest release. However, the number of users and the size of the deployments are not that impressive, at least compared with this software development force.

Let us compare the user base of OpenNebula and OpenStack by using their latest surveys:

  • According to the most recent OpenStack user survey (November 2013), the project received 827 responses, of which 387 described deployments. In 80% of these deployments the number of nodes was below 100, and only 11 deployments had more than 1,000 nodes (hypervisors).
  • On the other hand, in the latest OpenNebula survey (November 2012), OpenNebula received 2,500 responses, of which 820 described deployments. In 70% of these deployments the number of nodes was below 100, and 99 deployments had more than 500 nodes (hypervisors).


We avoid citing featured users; both projects could put good references of large-scale cloud deployments on the table. The surveys show that OpenNebula and OpenStack are achieving a similar level of deployment. However, OpenStack presents a ratio of roughly 1/40 between deployments in the survey and the number of people involved, a ratio of 1/3 between deployments and developers, and a ratio of 1/2 between deployments and companies involved. Perhaps not every company contributed to the survey.
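As a rough back-of-the-envelope check, here is a minimal Python sketch that reproduces those ratios from the community figures quoted above; the counts are the approximations cited in this post, not authoritative data.

```python
# Back-of-the-envelope ratios using the figures quoted above
# (the post's approximations, not authoritative data).
deployments = 387       # deployments in the OpenStack user survey, Nov 2013
people      = 15_000    # people involved, per openstack.org
developers  = 1_000     # developers in the latest release (approx.)
companies   = 850       # companies involved, per openstack.org

print(f"deployments vs. people involved : ~1/{round(people / deployments)}")      # ~1/39
print(f"deployments vs. developers      : ~1/{round(developers / deployments)}")  # ~1/3
print(f"deployments vs. companies       : ~1/{round(companies / deployments)}")   # ~1/2
```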

We could also use the volume of web searches according to Google Trends to compare the impact of both projects. The ratio in the number of searches between OpenNebula and OpenStack during the last 12 months is 1/20. This mainly reflects the successful marketing of OpenStack. OpenNebula invests its resources mainly in developing technology and serving its users, remaining truly vendor agnostic and spending little on marketing.

There is also a quarterly comparative analysis of the community activity (mostly mailing list traffic) of the four main open-source cloud management platforms: OpenStack, OpenNebula, Eucalyptus and CloudStack. The number of threads and participants in OpenStack is one order of magnitude higher than in OpenNebula. This mostly reflects a higher number of developers. Moreover, it is worth noting that development coordination in OpenNebula happens through a Redmine portal rather than through a mailing list.

Resources Invested
We conservatively estimate the investment in OpenStack at approximately $300 million per year:

  • OpenStack Havana involved 950 developers, almost all of them employed by vendors. This is approximately $150 million per year.
  • The OpenStack Foundation budget is approximately $10 million per year.
  • Just seven of the many start-ups involved in OpenStack have raised $120 million from VCs. Assuming this funding covers three years, that is approximately $40 million per year.
  • There are other direct costs from the many other companies involved (almost 1,000), which also allocate resources to development, training, documentation and so on, plus a large overhead in indirect costs and, of course, opportunity costs.

So $300 million per year is a good conservative estimate. We have seen other estimates above $0.5 billion per year, some reaching $1 billion per year. In any case, over a few years, it adds up to billions. Are these companies getting this money back? I see VCs starting to ask, "Where's our future money?". Summarizing: a relatively small user base, and hence limited market penetration, compared with the investment made by vendors and VCs. OpenNebula, with a budget at least two orders of magnitude lower, is achieving a similar user base. You can draw your own conclusions.
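For readers who want to check the arithmetic, here is a minimal sketch tallying the components listed above. The first three figures are the estimates quoted in this post; the remaining ~$100 million is an assumption standing in for the unquantified "other" direct and indirect costs needed to reach the $300 million total.

```python
# Tallying the yearly OpenStack investment estimate (US dollars per year).
# The first three figures come from the post; the last is an assumption
# standing in for the unquantified "other" direct and indirect costs.
developer_salaries  = 150_000_000   # ~950 vendor-employed developers
foundation_budget   =  10_000_000   # OpenStack Foundation budget
vc_funding_per_year =  40_000_000   # $120M raised by 7 start-ups over ~3 years
other_costs         = 100_000_000   # assumed: other companies, training, docs, overhead

total = developer_salaries + foundation_budget + vc_funding_per_year + other_costs
print(f"estimated yearly investment: ${total / 1_000_000:.0f} million")  # ~$300 million
```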

More Stories By Ignacio M. Llorente

Dr. Llorente is Director of the OpenNebula Project and CEO & co-founder at C12G Labs. He is an entrepreneur and researcher in the field of cloud and distributed computing, has managed several international projects and initiatives on Cloud Computing, and has authored many articles in leading journals and conference proceedings. Dr. Llorente is one of the pioneers and world's leading authorities on Cloud Computing. He has held several appointments as an independent expert and consultant for the European Commission, several companies and national governments. He has given many keynotes and invited talks at the main international events in cloud computing, has served on several Groups of Experts on Cloud Computing convened by international organizations such as the European Commission and the World Economic Forum, and has contributed to several Cloud Computing panels and roadmaps. He founded and co-chaired the Open Grid Forum Working Group on the Open Cloud Computing Interface, and has participated in the main European projects in Cloud Computing. Llorente holds a Ph.D. in Computer Science (UCM) and an Executive MBA (IE Business School), and is a Full Professor (Catedratico) and Head of the Distributed Systems Architecture Group at UCM.
