By Jiten Patil
July 16, 2012 05:00 AM EDT
Open source has proven to be a good option for building, managing, and delivering scalable infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) clouds. Most open source cloud platforms support multiple virtualization technologies, giving enterprises a range of choices across both closed and open source vendors. Examples include Eucalyptus, OpenStack, Cloud Foundry, OpenNebula, Red Hat OpenShift, the Xen Cloud Platform (XCP), and the newest kid on the block, Citrix CloudStack 3. While some apprehension still exists around open source, attitudes are shifting as enterprises look to capitalize on efficiency gains from technologies like virtualization and cloud computing, which have become essential components of IT architecture.
OpenStack is a massively scalable cloud operating system for delivering and managing infrastructure. Initiated by Rackspace and NASA, it is supported by almost 180 organizations, including Intel, Dell, Canonical, AMD, Cisco, HP, SUSE Linux, Red Hat, and IBM. It is a collaborative effort by thousands of developers and technologists worldwide, aimed at helping SMBs, service providers, data centers, corporations, and researchers roll out and leverage industry-grade public and private clouds.
Cloud Foundry is an open PaaS initiated by VMware that lets users choose from multiple deployment clouds, development frameworks, and application services. Eucalyptus has been in the space for more than three years and helps enterprises implement IaaS clouds. It also provides a strong hybrid cloud deployment option, since it supports the Amazon AWS application programming interfaces for building and delivering applications on top. Red Hat's OpenShift, an auto-scaling PaaS, is also gaining traction and support from many organizations. XCP helps with server virtualization and with building enterprise cloud platforms on the Xen hypervisor. OpenNebula is another open source standard for data center virtualization, offering customizable solutions for managing virtualized data centers based on Xen, VMware, and KVM. Citrix CloudStack, on the other hand, is supported by almost 50 organizations, including software and service providers like RightScale, Engine Yard, Opscode, CumuLogic, Puppet Labs, Hortonworks, Equinix, Juniper Networks, and ScaleXtreme.
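That AWS API compatibility is concrete: an EC2-compatible cloud like Eucalyptus accepts the same signed Query API requests as Amazon, so only the endpoint host changes. As a minimal sketch (not Eucalyptus's own code), here is the AWS Signature Version 2 signing step that both endpoints expect, using only the Python standard library; the host name and credentials below are hypothetical placeholders.

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_ec2_request(host, params, secret_key, method="GET", path="/"):
    """Build an AWS Signature Version 2 query string for the EC2 Query API.

    EC2-compatible clouds (such as Eucalyptus) accept this same request
    format, so the same client code can target either endpoint.
    """
    params = dict(params, SignatureMethod="HmacSHA256", SignatureVersion="2")
    # Canonical query string: parameters sorted by name, percent-encoded.
    canonical = "&".join(
        "%s=%s" % (urllib.parse.quote(k, safe=""),
                   urllib.parse.quote(str(v), safe=""))
        for k, v in sorted(params.items())
    )
    string_to_sign = "\n".join([method, host, path, canonical])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode()
    return canonical + "&Signature=" + urllib.parse.quote(signature, safe="")

# Pointing the same signed call at a hypothetical private Eucalyptus
# endpoint instead of ec2.amazonaws.com:
qs = sign_ec2_request("eucalyptus.example.internal",
                      {"Action": "DescribeInstances",
                       "Version": "2012-06-01",
                       "AWSAccessKeyId": "EXAMPLEKEY"},
                      secret_key="EXAMPLESECRET")
```

In practice a real request also carries a `Timestamp` parameter and is sent over HTTPS; the point here is that nothing in the signing logic is Amazon-specific, which is what makes the hybrid deployment story work.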
Today, the open source platforms and technologies referenced above are being adopted by small and large organizations alike, for different reasons and to different degrees. Some are used to create and deliver internal infrastructure, applications, or workloads, while others are leveraged to build public cloud services. Similarly, open source virtualization platforms like Xen and KVM are already a critical piece of many cloud solutions. One such example is the Amazon Web Services (AWS) cloud platform. The AWS IaaS cloud is perhaps the most popular public cloud, estimated to be a $1 billion business, and Amazon's core compute service, EC2, is powered by Xen. This alone should reassure the skeptics, yet despite all the advancements in open source cloud platforms, CIOs have remained apprehensive about open source software because of the absence of a formal support infrastructure. Other concerns include security, the lack of proper roadmaps, complexities around IP rights, and the ability to evaluate and endorse open source projects. But it's not all negative - open source innovation is acting as a catalyst for positive shifts in the computing paradigm, ultimately helping CIOs, and the market for open source cloud software continues to grow day by day.
Open source is also driving another interesting change: when it comes to contributing to open source initiatives, it's all about co-creation. Today, CIOs and service providers alike are expected to "do more with less," which puts tight constraints on budgets and timelines, and that pressure fosters a culture of co-creation through collaboration rather than competition. The good news is that organizations now have multiple open source options for solving a given problem and can choose the one that best meets their requirements. We've reached a point where it's important for organizations to create a strategy around open source platforms, targeting the ones that align with business propositions and strategic goals rather than focusing only on tactical wins. In the context of cloud computing, experts believe open source promises to make the technologies behind the cloud a commodity.
People sometimes misunderstand the cost model associated with open source and fail to account for its inevitable costs. There is often confusion between a no-cost and a low-cost option. Although open source software incurs no acquisition cost - there are no license fees or annual charges - organizations still need to budget for administration and support. Even so, compared with commercial software platforms, the total cost of ownership will typically be significantly lower.
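The distinction is easy to see with a back-of-the-envelope model. The sketch below uses entirely hypothetical figures (none come from any real vendor) just to show how a zero license fee can still leave a meaningful, though smaller, total cost once administration and support are counted.

```python
def total_cost_of_ownership(license_fee, annual_support, admin_cost, years):
    """Toy TCO model: up-front license fee plus recurring support and
    administration costs over the planning horizon. Illustrative only."""
    return license_fee + years * (annual_support + admin_cost)

# Hypothetical 3-year comparison: the open source option has no license
# fee but is assumed to need more in-house administration effort.
commercial = total_cost_of_ownership(license_fee=100_000,
                                     annual_support=20_000,
                                     admin_cost=30_000, years=3)
open_source = total_cost_of_ownership(license_fee=0,
                                      annual_support=15_000,
                                      admin_cost=45_000, years=3)
# commercial -> 250000, open_source -> 180000
```

The open source column is lower here, but it is clearly not zero - which is exactly the "no-cost vs. low-cost" confusion described above.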
Security has always been a talked-about issue with open source, and it's a valid concern. In this new era of open source innovation, however, communities sponsored by technology organizations are not only getting stronger but are also fueling advancement through co-innovation, and that has changed the perspective on open source security as well. Open source software is available for anyone and everyone to use and work with. This means a large global community of developers contributes to the code: they inspect it, review it, test it against various scenarios, and analyze it for vulnerabilities. The improved software is then fed back to the community to be tested and inspected again, so the code keeps evolving as many minds add their intelligence, becoming more secure over time. With a commercial software product, the vendor organization helps you solve the problem; in the open source scenario, perhaps the whole technology world conspires toward your success.
Open source offers other advantages in cloud and virtualization as well: affordability (lower TCO), flexibility to customize, transparency in the stack, no vendor lock-in, better interoperability, and a commitment to portability when migrating workloads. The freedom to choose technologies, frameworks, and tools for building applications and peripherals is another advantage, and community participation and support from a vast base of developers across geographies bring extra firepower and leeway.
No doubt closed source products will continue to have their markets; however, they will be under constant pressure from the impact and penetration that open source platforms are creating and are capable of advancing. Open source is very much happening, cutting across all layers of the cloud architecture. This means organizations today can't afford to ignore it: when thinking about cloud as an enabler, they will need to invest in talent, experiment, or lay out a full-blown strategy. The future may belong to businesses that take on new technology bets - trying different computing and delivery approaches and experimenting with constantly evolving technologies that are more open and collaborative. Open source adoption is fueling, and will continue to fuel, the disruption that new edge technologies like cloud computing are causing... what do you think?