By Paul Miller
December 14, 2012 07:00 AM EST
Hewlett Packard used its Discover event in Frankfurt last week to reassert the company’s cloud credentials. Public, private, hybrid; HP is painting pictures that encompass them all, whilst seeking to protect hardware revenues and reassure conservative executives at some of its largest and most profitable customers. But HP has been here before, making bold claims and telling people what they wanted to hear about an HP cloud upon which enterprises could depend. This time, will the company deliver?
Earlier this year, satirical news site The Onion took a cruel but funny swipe at HP’s cloud pretensions. HP, the sketch suggested, had the answers, the technology, and a lot of cloud. The company has done — and continues to do — a lot right in this space, but it really did bring this derision upon itself. Mixed messaging, repeated announcements of amazing new cloud services that never quite saw the light of day, and an endless stream of apparent strategy U-turns that must surely have left long-time HP executives as dizzy as those of us trying to understand their intentions: none of this helped HP. But now, Windows Azure is apparently behind us. PalmOS (or whatever it’s called these days) is no longer the glue binding hardware, peripherals, software and data together. Amazon is an inevitable piece of the whole. And at HP, the new story is one (more or less) of an OpenStack public cloud called HP Cloud (or HP Public Cloud), a VMware private cloud called Cloud System, and a professional services sell called Managed Cloud for Enterprise (which is messily spread across large swathes of HP’s dreadful website, with no obvious landing page to link here).
A public cloud
The biggest cloud news out of Discover was probably the General Availability (at last) of HP’s OpenStack-powered public cloud offering. In keynotes and workshops, it was somewhat surprising to see the extent to which OpenStack and other enabling technologies were not mentioned. This was HP’s cloud, and the implication was clearly that HP know-how was what made it tick. HP hardware, HP software, HP cleverness. None of the ‘Intel Inside’ co-branding, Microsoft Diamond Sponsor love-in or VMware strategic partner rhetoric for this open source project, it seems. But, more relevantly, also none of the recognition that other named open source projects like the various Linux distributions do receive from HP.
Given the rather raw state of some OpenStack components, HP engineers have been busy stitching pieces together, but I would have expected HP to be telling more of a story about portability, about interoperability, and about the breadth and depth of the OpenStack community that customers would be joining. That story wasn’t told, and you had to know where to look to find much mention of the elusive OpenStack at all.
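The portability argument is simple enough to sketch. Assuming the OpenStack Identity (Keystone) v2.0 API of the day, the same authentication payload works against any OpenStack cloud, public or private; only the endpoint URL changes. (The HP Cloud endpoint shown below is illustrative, not authoritative.)

```python
import json

def keystone_auth_payload(username, password, tenant):
    """Build the Keystone v2.0 password-auth request body.

    This JSON shape is defined by the OpenStack Identity v2.0 API,
    so it is identical whichever OpenStack cloud receives it.
    """
    return {
        "auth": {
            "passwordCredentials": {
                "username": username,
                "password": password,
            },
            "tenantName": tenant,
        }
    }

# The same body POSTs to HP Cloud, Rackspace, or an in-house OpenStack
# install -- only this URL (an example, not a guaranteed endpoint) differs.
auth_url = "https://identity.example-openstack-cloud.com/v2.0/tokens"
body = json.dumps(keystone_auth_payload("alice", "s3cret", "demo-project"))
print(body)
```

That sameness, rather than any one vendor's cleverness, is the portability story that went largely untold in Frankfurt.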
One place, it must be said, where the company was far more forthcoming was in the private Coffee Talks arranged for us by the team at Ivy. In frank cloud discussions such as those with Christian Verstraete, Chris Purcell, Florian Otel and others, far more of the detail — and rationale — was laid on the table.
Pricing is competitive, and it will be interesting to see how HP moves forward here. HP’s public cloud makes plenty of sense for enterprise customers already using HP kit and services elsewhere. But will a startup or a non-customer choose the HP Cloud in preference to Amazon or Google or Rackspace?
They might, if the messaging is right. German cloud analyst René Büst asserted in Frankfurt that “the next Instagram would never choose to start and grow on the HP Cloud”, as Amazon has all the mind-share in the startup community. Does HP care enough about the world beyond existing enterprise accounts to accept René’s challenge and entice that next cool startup? Is it, frankly, worth their while when their entire selling and support machine is geared toward people in suits who value fancy lunches and a Christmas card far more than credit card sign-up and cost competitiveness?
A private cloud
HP’s private cloud offering has been around a little longer, but the company reiterated — and reinforced — messages originally delivered at the Las Vegas Discover a few months back: Cloud System supports ‘bursting’ of compute jobs from an enterprise’s own private cloud to external providers such as HP’s public cloud and Amazon. This is a capability that will become increasingly important as even the most conservative enterprise customers begin their gradual transition out of the data centre and into the cloud.
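As a rough illustration of the bursting idea (my sketch, not HP’s implementation — all names and numbers here are invented), a minimal policy simply fills private capacity first and overflows the remainder to a public provider:

```python
# Hypothetical bursting policy: run jobs on private capacity until it is
# full, then overflow ("burst") the rest to a public cloud provider.

def place_jobs(jobs, private_capacity):
    """Split (job_name, cores) pairs into private and burst placement lists."""
    private, burst = [], []
    used = 0
    for job, cores in jobs:
        if used + cores <= private_capacity:
            private.append(job)
            used += cores
        else:
            burst.append(job)  # overflow to HP Cloud, Amazon, etc.
    return private, burst

queue = [("etl", 8), ("reporting", 4), ("batch-render", 16)]
print(place_jobs(queue, private_capacity=16))
```

Real schedulers weigh data gravity, compliance and egress cost as well as raw capacity, but the shape of the decision is this simple.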
Whatever Amazon and Salesforce executives might say in public about “the false cloud” or the number of Fortune 100 companies happily doing something on their public cloud infrastructure, they and we know that this is going to be a long game. HP’s flagship customers will move. Eventually, they’ll move almost everything. But it will take a decade or more, and there’s plenty of time to sell a few more private clouds and an awful lot of servers and storage arrays before that day comes.
A recognition of Amazon
HP’s messaging no longer tries to persuade customers that it will always meet every one of their cloud needs. HP has products and solutions to offer, but it is recognising that it needs to fit into a complex mixed environment. The company also recognises that Amazon is an inevitable part of that environment, and that HP solutions need to augment and add value with respect to Amazon. Helping customers to use Amazon when it’s appropriate is a far more effective strategy, long term, than either denying Amazon’s existence or insisting that its solutions are not fit for enterprise consumption. Neither is true, and HP’s customers are smart enough to realise that.
The SLA is king, maybe
One area in which HP is trying to differentiate itself from Amazon is in terms of Service Level Agreements, and this should play well with an enterprise audience. Rather than necessarily worrying about what hardware cloud infrastructure runs on, or whether it’s located on-premise, in a known and audited off-premise location, or out there in the fuzziness of the unbounded public cloud, HP is telling a story that focuses far more upon level of service, level of resilience, etc.

This makes a lot of sense. I often don’t actually care whether data runs on my own machines or not. What I care about is whether or not my compliance and business requirements are being met. So instead of choosing public or private, off-premise or on, it makes a lot more sense to think about the business and compliance requirements that a particular solution helps me meet. One solution (on or off-premise) may be more secure, more robust, more disaster resilient, and it will come with an SLA (and a price tag) to reflect that. Another (again, on or off-premise) may be more suited to general crunching of less sensitive data. It’ll be more prone to failure, and cheaper. We tend to assume that our own data centre is the logical home of the former, and that the public cloud is a pretty cost-effective way to handle the latter. That’s not necessarily true, and that’s why it’s refreshing to at least begin to think in more nuanced terms.

Unfortunately, although HP execs planted these ideas during their keynotes, the follow-up material quickly fell back into public v private, dodgy commodity kit v HP ‘enterprise grade’ hardware, etc. And that’s a shame.
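One way to make SLA talk concrete is to translate availability percentages into the downtime they actually permit. The tiers below are illustrative, not HP’s actual terms, and the arithmetic assumes a 30-day month:

```python
# Convert an availability SLA percentage into permitted downtime per period.
# SLA tiers shown are illustrative examples, not any vendor's real terms.

def allowed_downtime_minutes(sla_percent, period_minutes=30 * 24 * 60):
    """Minutes of downtime a given availability percentage allows."""
    return period_minutes * (1 - sla_percent / 100.0)

for sla in (99.0, 99.9, 99.95):
    print(f"{sla}% availability -> {allowed_downtime_minutes(sla):.1f} min/month")
```

The gap between tiers is what an SLA-led conversation prices: roughly seven hours a month at 99% versus around twenty minutes at 99.95%, wherever the workload physically runs.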
Gartner’s Lydia Leong takes a deeper look at HP’s latest SLAs, and suggests that they may not be living up to their own rhetoric either. There’s plenty of work still to do in this area, and an effective means of differentiating service and value propositions is long overdue.
Dell goes the other way
HP uses OpenStack for its public cloud, and VMware sits beneath its private offerings. Speaking at Dell World this week, Michael Dell announced that his company is doing the exact opposite: Dell’s existing VMware-powered public cloud is to be joined by a private cloud offering powered by OpenStack.
The public and private offerings of HP and Dell certainly aren’t directly comparable, but it is interesting that the two companies have reached such superficially odd decisions. It even raises the prospect that a customer of HP’s private cloud may find it easier to move to Dell’s public cloud than to HP’s, and that a customer of Dell’s private cloud may find it easier to move workloads to HP’s public cloud than to stick with Dell. Odd at best, this should be raising eyebrows in both Round Rock and Palo Alto.
Will the Converged Cloud actually, you know, Converge?
HP has a lot to say about convergence, both in its hardware business and in the cloud. And yet, it can be surprisingly difficult to see how the public and private pieces of the HP cloud portfolio really fit together. More often than I’d have expected, HP staffers discussing either the public or private cloud offerings spoke as if theirs was the only cloud in HP-land. A slip of the tongue once, or perhaps twice, but this was repeated again and again and again in Frankfurt. The joined-up story, and the reality of customers starting in either HP Cloud or Cloud System before realising a need to embrace parts of the other, doesn’t seem to be getting through on the ground.
HP is a big ship, with some smart people and some great technology. But if it doesn’t tell a single — compelling — story and back it up with an attractive business model, it’s toast.
I can’t remember who it was, but someone in Frankfurt remarked in passing that HP would come through its current troubles “because it had technical chops.” Sadly for HP, that is simply not true. You can have the best technology in the world. But without a defined (or creatable) market requirement, a viable business proposition, and some credible messaging, all of that amazing technology is just some very expensive scrap metal. And a fatal red stain, spreading across the balance sheet.
HP has the technical pieces. It has the people pieces. It has some of the business model pieces. It has parts of the compelling story. It’s time the company joined those together credibly, filled in the gaps, and stopped shooting itself in the foot.
At least starve The Onion of material, so its writers have to try a little harder next time.
Disclosure: acting on behalf of Hewlett Packard, Ivy Worldwide invited me to Discover and covered travel and expenses associated with the trip. There was no requirement that I write about HP, and no requirement that any coverage be favourable.
- At Discover, HP takes the beta sticker off its public cloud (venturebeat.com)
- HP Switches On Public Cloud, Thanks To OpenStack (readwrite.com)
- Hewlett-Packard Cloud Strategy Starts Coming Together (seekingalpha.com)
- When an HP cloud is not an HP cloud (and whether it matters) [GigaOM] (gigaom.com)
- Rackspace CTO: Services & OpenStack – Not Price – Key To Winning Cloud Computing (readwrite.com)
- What HP’s cloud chief wants you to know about HP’s cloud [GigaOM] (gigaom.com)