
Is PaaS Dying?

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed

The ‘platform’ tier in the middle of cloud computing’s architecture is being squeezed, folded and reshaped beyond recognition. Even with continued investment, can it survive the transformative pressures forcing down upon it from the software/application layer above, or the apparently inexorable upward movement from the infrastructure layer upon which it rests?

To look at recent investments and enthusiastic headlines, it would be easy to assume that Platform as a Service (or PaaS) is on the up. Red Hat recently trumpeted the launch of OpenShift Enterprise — a ‘private PaaS,’ whatever that might be. Eagerly tracked super-startup Pivotal pushed PivotalOne out to the world, strengthening the position of the Cloud Foundry PaaS offering upon which it sits. Apprenda, a PaaS that almost predates wider recognition of the term, secured an additional $16 million to continue expanding. And, more tightly integrated into Salesforce’s latest vision for world domination, Heroku continues to attract enthusiasts.

And yet, the role of rich PaaS ‘solutions’ is under increasing pressure. More lightweight approaches such as Docker are attracting attention and, perhaps more importantly, the other layers of the cloud architecture are adding capabilities that look increasingly PaaS-like. The orchestration capabilities of Amazon’s Elastic Beanstalk, for example, mean that many (but by no means all) AWS users no longer need the PaaS tools they might previously have integrated into their toolkit. We’ll keep needing PaaS functionality, but it may not be long before the idea of a separate PaaS solution no longer makes good sense.
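To make that point concrete, here is a minimal sketch of the kind of orchestration Elastic Beanstalk exposes directly from the infrastructure provider, using AWS’s Python SDK (boto3). The application name, bucket, region and the way the solution stack is chosen are all placeholder assumptions for illustration, and the calls presume valid AWS credentials and a code bundle already sitting in S3; the point is simply that application registration, versioning and environment provisioning are available without a separate PaaS product.

```python
# A minimal sketch (assumed names throughout) of PaaS-like orchestration
# obtained straight from the IaaS provider, via AWS Elastic Beanstalk and boto3.
import boto3

eb = boto3.client("elasticbeanstalk", region_name="us-east-1")  # region is a placeholder

# Register an application and point a version at a code bundle already in S3.
eb.create_application(ApplicationName="demo-app")
eb.create_application_version(
    ApplicationName="demo-app",
    VersionLabel="v1",
    SourceBundle={"S3Bucket": "my-deploy-bucket", "S3Key": "demo-app-v1.zip"},
)

# Ask Beanstalk to provision instances, load balancing and monitoring for it.
# Valid stack names change over time, so pick one from the live list.
stacks = eb.list_available_solution_stacks()["SolutionStacks"]
eb.create_environment(
    ApplicationName="demo-app",
    EnvironmentName="demo-app-env",
    VersionLabel="v1",
    SolutionStackName=next(s for s in stacks if "Python" in s),
)
```

None of this is an argument that Beanstalk replaces every PaaS feature; it simply illustrates how much of the ‘glue’ now ships with the infrastructure layer itself.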

For many years, some of the most basic explanations of cloud have broken it into three layers:

  • at the top, Applications, Services and Software. The things most people see and interact with. The GMails and Salesforces and Boxes of the world;
  • at the bottom, Infrastructure. The nuts and bolts. The engine room. The servers and routers and networks. To paraphrase former colleague Ray Lester, the stuff;
  • and, in the middle, the Platform. The piece that assembles bits of network and bits of infrastructure and bits of code, and simplifies the process of knitting them all together in order to deliver one of those apps or services. The glue, if you will.
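For readers who prefer code to metaphor, the toy Python sketch below restates that ‘glue’ role. Every function and name in it is hypothetical, standing in for the provisioning, networking and deployment steps a platform automates, rather than describing any real PaaS API.

```python
# A purely illustrative toy sketch of the "glue" role a platform plays.
# All names are hypothetical; no real PaaS exposes exactly this API.
from dataclasses import dataclass


@dataclass
class Deployment:
    app_name: str
    hosts: list
    url: str


def provision_compute(instances: int) -> list:
    # A real platform would request VMs or containers from the infrastructure layer.
    return [f"vm-{i}" for i in range(instances)]


def wire_networking(app_name: str, hosts: list) -> str:
    # ...and would configure routing, load balancing and DNS in front of them.
    return f"https://{app_name}.example-platform.net ({len(hosts)} hosts)"


def deploy(app_name: str, code_bundle: str, instances: int = 2) -> Deployment:
    # The platform knits infrastructure, networking and code together,
    # so the developer only supplies an application bundle.
    hosts = provision_compute(instances)
    url = wire_networking(app_name, hosts)
    return Deployment(app_name, hosts, url)


if __name__ == "__main__":
    print(deploy("my-app", "my-app-v1.tar.gz"))
```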

The role of the platform is clear, compelling, and powerful. It should be the fundamental piece, far more important and interesting than a bunch of cheap virtual machines running on commodity hardware. It should be the driving force behind cloud; the reason cloud can continue to transform businesses and business models around the world. It should be all that and more, but PaaS as a category falls far short of this promise.

In early planning for VentureBeat’s second CloudBeat conference, in 2012, Ben Kepes and I argued for PaaS, PaaS vendors and PaaS customers to be given real prominence. We knew that the story of the glue was where this whole industry needed to shift. That’s still true today. The glue remains important, but maybe it’s less clear that we need — or that the market can sustain — glue companies. Instead, those capabilities are increasingly to be found across a range of loosely coupled components, or in the offerings of Applications and Infrastructure providers both above and below the PaaS layer. CenturyLink’s recent acquisition of Tier3 is a clear attempt to address exactly this, moving up from the Infrastructure layer.

I’m far from alone in asking questions about PaaS in its current form. My friend René Büst, for example, argued this week that PaaS is typically used for prototyping work but that it doesn’t permit sufficiently granular control for the most efficient delivery of enterprise-grade applications. Possibly an over-simplification, but it’s still a sentiment that is increasingly repeated. Over at Gigaom, Barb Darrow has been asking the question too, most recently with ‘So… do you really need a PaaS?’ For now, Barb appears unsure about how to answer her own question… but the comments on her post are pretty conclusively in the affirmative. Matt Asay offers his own take on the Twitter conversation which inspired Barb, writing a more up-beat piece for ReadWrite:

The ‘platform as a service’ market—or PaaS, in which cloud companies provide developers with hardware, OS and software tools and libraries—is starting to heat up. IDC predicts it will reach $14 billion by 2014, and competitors are angling for enterprise wallets.

Matt closes by stressing the importance of solid, sustainable customer adoption: a very different thing from the froth, page views, and jockeying for popularity that seem to underpin much of the conversation today.

Another friend, Ben Kepes, has been tracking the PaaS space closely for several years, and recently commented:

It’s a strange time for PaaS in general. Pivotal One’s flavor of Cloud Foundry seems to be sucking up the vast majority of the mindshare, leaving other Cloud Foundry vendors scratching their heads over how to differentiate. At the same time RedHat is trying to achieve some kind of breakout velocity for its own version of PaaS, OpenShift. Stalwarts Heroku (now owned by Salesforce.com) and EngineYard keep turning the PaaS wheel also. Add to that the fact that some of the OpenStack players have decided to create their own PaaS initiative, Solum, and you have a confused and confusing market. Throw the monsters from Seattle, AWS and Microsoft, on top of that and seemingly there is one vendor for every one of the half dozen companies in the world that have actually made a decision to buy PaaS.

From here in sunny (and, for once, it actually is) East Yorkshire, the various PaaS vendors appear hard-pressed to tell a truly compelling story right now. Bits of their product offering resonate with customers, but only really around those functions that are increasingly aped by other providers from beyond the PaaS world.

The broader story, of deep integration and easy orchestration, raises as many red flags as it does smiles of welcome. Is it about simplicity or loss of control, integration or lock-in? At a time when public, private and hybrid cloud implementations are becoming more mainstream, more mission-critical, and more capable, I hear far more concern expressed about relying upon PaaS than I do about relying upon a cloud infrastructure provider or a SaaS product vendor. Which isn’t to say that those cloud builders don’t need PaaS-like capabilities. They do. They’re just (increasingly) looking elsewhere to find them.

And proponents of PaaS are evolving, too, perhaps faster than the companies with which they were once associated. One of my meetings during a trip to San Francisco earlier this month was with Derek Collison. Formerly CTO Cloud Platforms at VMware (and intimately involved in the incubation of Cloud Foundry), Collison is now CEO of Apcera. Barb Darrow commented as Apcera emerged from stealth last month,

The company describes Continuum as an IT platform that ‘blends the delivery models of IaaS, PaaS, and SaaS’ but overlays (underlays?) them all with technology that handles policy. PaaS is great for developers, according to the blog post, but it’s not enough to deliver applications for grown-up companies that must deal not just with technology but with compliance and regulatory rules and regs.

(my emphasis)

Collison talks compellingly about the need to move beyond separate consideration of infrastructure, integration and deployment capabilities, and the application. Instead, he sees a continuum of capabilities with different levels of abstraction suited to meeting the real (and complex) requirements of an enterprise and its hybrid IT estate. Policy, Collison argues, “must be a core part of the DNA” in order to truly meet the business needs of an organisation. It’s early days for Apcera, and it remains to be seen whether this is a truly new take on the space or simply a more enterprise-friendly reframing of the problem.
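As a thought experiment only (this is not Apcera’s API, and every rule and name below is invented), ‘policy as a core part of the DNA’ might look something like a gate that every deployment request passes through, whichever layer of the continuum it targets:

```python
# Hypothetical sketch: policy evaluated on every deployment request,
# regardless of whether the target is IaaS, PaaS or SaaS. Invented names throughout.
from dataclasses import dataclass


@dataclass
class DeployRequest:
    app: str
    target_layer: str        # e.g. "iaas", "paas", "saas"
    region: str
    handles_card_data: bool


POLICIES = [
    # Each policy returns an error message when violated, otherwise None.
    lambda r: "card data must stay in approved regions"
    if r.handles_card_data and r.region not in {"eu-west-1", "us-east-1"}
    else None,
    lambda r: "unknown target layer"
    if r.target_layer not in {"iaas", "paas", "saas"}
    else None,
]


def evaluate(request: DeployRequest) -> list:
    """Run every policy against the request and collect any violations."""
    return [msg for policy in POLICIES if (msg := policy(request)) is not None]


if __name__ == "__main__":
    req = DeployRequest("billing", "paas", "ap-southeast-2", handles_card_data=True)
    violations = evaluate(req)
    print("allowed" if not violations else f"blocked: {violations}")
```

The sketch is trivial, but it captures the shift Collison describes: the interesting question stops being ‘which layer am I deploying to?’ and becomes ‘does this deployment satisfy the organisation’s rules?’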

So, is there a future for today’s PaaS companies? It sometimes seems unlikely that they can keep doing what they’re doing and make enough money to grow sustainably. Will they be rendered irrelevant by the increasing capability of offerings above and below them in the stack? Will new and more integrated offerings such as Apcera’s eat their lunch? Or can they rise, Phoenix-like, from the ashes of current business models to meet a broader set of business requirements? If they do, how recognisable will their new incarnations be?

Image of a Phoenix from the 15th Century Nuremberg Chronicle. Public Domain image shared with Wikimedia Commons.

More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database.

He blogs at www.cloudofdata.com.
