


The Commercial Case for Open Source Software

What can a full-blown open source project with sound commercial routes ultimately achieve?

This post is written in association with Pentaho, a commercial open-source (COSS) provider of reporting, analysis, dashboard, data mining and data integration software.

The history of open source has already been written and rewritten a couple of times, so there's no need to go back to Genesis chapter one and revisit Linus Torvalds' "just a hobby, won't be big" comments too often.

But open source became more than the sum of its parts and the hobbyists grew successful in domains that traditionally belonged to their proprietary relatives.

Historical Note: If you do still want the history of open source, then the YouTube-hosted documentary Revolution OS offers about 100 minutes of the best commentary on open development you will find.

Open source grew up, we know that part. With a rich pedigree of success in the server room, open platforms eventually moved upwards through the commercial sector and across to government in many developed nations.

What open source in these (and other mission-critical) implementations demands is not only the strong, active developer community that typifies any open code base; it also very often needs a level of expert support and maintenance more formalized than what is available for free through the community. This especially applies to teams trying to solve 'hairy' problems for which skills are in short supply, such as blending and analyzing diverse 'big' data sets.

Support and maintenance are important, but there's another factor here.

Locked Down, Demarcated Openness
More specifically (and more technically), open code is built on inherently dynamic libraries that are subject to change and community contribution at any time. Commercial versions of open source software, by contrast, are locked down and demarcated at the point of sale and therefore insulated from these dynamic changes.

This means that when organizations like NASA and the Met Office (arguably 'mission critical') use commercial open source software, they are able to define the exact, static form and function of applications at the point of installation.

This effectively eliminates the risk factors inherent with open code dynamism.
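As an illustration of what "locked down and demarcated at the point of sale" can mean in practice, a vendor-curated distribution typically pins every library to an exact, tested release rather than tracking the community's latest builds. For a Java-based stack like Pentaho's, that discipline might be expressed in a Maven dependency declaration along these lines (the group IDs, artifact names and version numbers here are purely illustrative, not Pentaho's actual coordinates):

```xml
<!-- Hypothetical pinned dependencies: exact versions, no ranges or SNAPSHOTs,
     so the installed application cannot drift as the community code base evolves. -->
<dependencies>
  <dependency>
    <groupId>org.example.reporting</groupId>
    <artifactId>report-engine</artifactId>
    <!-- fixed release, vetted by the vendor's QA cycle -->
    <version>4.8.2</version>
  </dependency>
  <dependency>
    <groupId>org.example.etl</groupId>
    <artifactId>data-integration-core</artifactId>
    <!-- an exact version, never 5.1.0-SNAPSHOT or a range like [5.1,) -->
    <version>5.1.0</version>
  </dependency>
</dependencies>
```

The contrast with a pure community build is the absence of version ranges and snapshot builds: every installation of the commercial edition resolves to the same tested artifacts, which is what allows an organization to certify the exact static form and function of the application at installation time.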

Other 'COSS Benefits'
A good commercial open source software (COSS) project employs lead developers who are professionals paid competitive salaries.

A good COSS model runs a rigorous quality assurance (QA) cycle for the open source project and offers a full set of services.

A good COSS model provides professional support offerings so that clients can depend on timely and accurate assistance.

COSS has proven, in many situations, to be more secure than proprietary software because of its larger development base. This is why the US Department of Homeland Security, which has incredibly high standards, promotes its use.

Commercial Drivers
According to Pentaho, commercial open source ultimately helps drive open source adoption.

The theory is that many more organizations will use open source software if they have access to support and services. Beyond that, a company behind an open source project helps assure potential users that the project has consistent vision, discipline, and longevity.

What can a full-blown open source project with sound commercial routes ultimately achieve?

Sun Microsystems famously took the Java platform forward to become one of the highest-profile open projects with commercial options. Fast forward to today, and the equally open Pentaho BI Project has worked to develop a comprehensive analytics platform that includes reporting, analysis, dashboards, data mining and ETL for true production deployment.

According to the company, "Many other projects that exist address a specific function like reporting, but not the entire BI spectrum. Most also lack the necessary infrastructure like security, administration, auditing, fail-over, scalability features, portal, and other key framework functionality. Beyond that, some projects offer open source reporting, but require an upgrade to an expensive, closed-source offering for web-based deployment or other BI platform functionality."

Looking Forward to (Commercial) Open Source Futures
If Torvalds' view of open source was 1.0 and the commercially supported iteration of open source was 2.0, then might we naturally expect version 3.0 to come forward at some point now?

This post is written in association with Pentaho. The firm has exerted zero editorial influence over the content presented here and simply seeks to fuel dialogue and discussion based around its mission to provide a cost-effective business analytics platform that fuels growth and innovation.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole.

