By Roger Strukhoff
February 14, 2014 06:30 PM EST
Is the pendulum swinging back toward local datacenters running private cloud? Some decisions I'm involved in certainly suggest it is, at least in my little part of the world.
But first let me ask: did the pendulum ever really swing toward public cloud? A September 2013 survey by CloudPassage of SMBs and enterprises (more than 1,000 employees) found 21% deploying private cloud only and 13% deploying public cloud only, with 48% deploying both. The survey also found that smaller businesses included public cloud more frequently.
Getting hard numbers on public vs. private adoption is all but impossible. So, given the one metric above and a few years' experience writing about and researching the topic, here is my take:
Initial enthusiasm was for public cloud taking over the world. The vision of Nicholas Carr's The Big Switch prevailed in cloud-related articles and speeches, in which computing resources were delivered and metered like water or electricity. This school of thought held that there was no such thing as private cloud: if it was on-premises, it wasn't cloud.
VMware's success in virtualizing a couple of billion dollars' worth of datacenters per year refuted the Big Switch vision. Even though proponents have always been careful to say that virtualization alone is not cloud, it sure feels that way when your local resources are suddenly much more productive and running much hotter.
The era of hybrid cloud ensued.
Thesis, antithesis, synthesis. Fichte (but not Hegel) would be proud.
I've long thought it would be very helpful if Jeff Bezos released the revenue figures for Amazon's public-cloud offerings. Surely he considers that secrecy part of his competitive advantage.
More important, as a potential customer I'd like to know Amazon's revenue and expenses, difficult as they may be to determine. I'm developing a sneaking suspicion that not only should cost savings not be the reason to move to public cloud, but that no such savings exist.
Total upfront cost aside, I made the opex-vs.-capex argument in favor of public cloud many times in the early days, just a few years ago.
But now I'm tasked with building a cloud for a startup with ambitious goals. The firm has capital and, in fact, wants to spend it on capital expenditure, because capex is tangible and easily funded. Running the numbers, I've found that we may be able to build and operate our datacenter locally over three years for less money than simply buying public cloud resources.
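The shape of that comparison is simple enough to sketch. The figures below are purely illustrative placeholders (not my actual numbers, which I'm not in a position to publish): a one-time capex outlay plus yearly opex for a single rack, against always-on public-cloud instances billed by the hour.

```python
# Illustrative three-year TCO comparison: one private-cloud rack
# vs. equivalent always-on public-cloud instances.
# ALL figures are hypothetical placeholders, not real quotes or prices.

YEARS = 3
HOURS_PER_YEAR = 24 * 365

# Private option: capex up front, plus yearly opex.
rack_capex = 120_000             # servers, networking, install (one-time)
private_opex_per_year = 30_000   # power, bandwidth, staff share, maintenance

# Public option: always-on instances at a hypothetical hourly rate.
instances = 20
rate_per_instance_hour = 0.50    # assumed on-demand price, $/hour

private_tco = rack_capex + private_opex_per_year * YEARS
public_tco = instances * rate_per_instance_hour * HOURS_PER_YEAR * YEARS

print(f"Private 3-yr TCO: ${private_tco:,.0f}")   # $210,000
print(f"Public  3-yr TCO: ${public_tco:,.0f}")    # $262,800
print(f"Difference:       ${public_tco - private_tco:,.0f}")
```

With these assumed inputs the local build comes out ahead; the point is not the specific dollar figures but that the break-even turns on utilization. The public option only wins when instances can be switched off for large stretches, which an ambitious, always-on business plan may not allow.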
We'll also have the additional benefits of stimulating a local economy (in rural Northern Illinois) that needs it badly, while tapping into a large fiber-optic network that was just laid down throughout the region as part of an $85 million government program. We have all the brick-and-mortar, construction expertise, and bandwidth we need here. We can provide jobs every step of the way, including once we're up and running.
I'm evaluating a whole bunch (for lack of a more elegant, precise term) of alternatives for PaaS (to create the software that will drive the datacenter), and for a private-cloud infrastructure that makes performance and economic sense.
We can do this with just a single rack of servers to start - I'm not talking about recreating an NSA or Google site. But we can blast enough cyberkinetic energy into the tubes of the Internet to serve a very ambitious business plan with our own private cloud. If we need more, we have plenty of room, bandwidth (as I said already), and electricity.
And if we run short of processing at any step of the way, I'll just give Jeff Bezos a call and see if he has some extra-large instances to sell to me on an occasional basis.