
Storms in the Cloud

No technology negates the need for proper planning, and the cloud is no different

There are things we tend to take for granted in our everyday lives. We have certain expectations that don't even have to be spoken; they're simply a given. If you walk into a room and flip the light switch, the lights will go on. If you turn on the water faucet, water will come out; if you pick up the telephone, there will be a dial tone. The possibility of any of those things not happening doesn't enter the conversation. These are services that are ubiquitous; we don't even think about them - they are just there.

In recent years, people have seen the impact Mother Nature can have on those core services such as electricity, water and phone. Storms, hurricanes, floods and blizzards have taken our expectations of these services and turned them on their head.

Cloud Computing, the New Light Switch
Cloud computing has become pervasive in both our personal and business lives; you cannot have a conversation about technology without the word "cloud" in it.

On a personal level, our music players are streaming from the cloud, our tablets and eReaders are getting books from the cloud, our TVs are streaming video from the cloud, and our smartphones and PCs are being backed up to the cloud. Google has glasses that connect you to the cloud, and Samsung just came out with a watch that connects you to the cloud. Like the electricity and water in your home, the cloud is always there - at least that has become the perception and expectation.

On a business level, our expectations are influenced by our personal exposure and experiences with technology. There is an assumption that by going to the cloud, the services provided will always be there, like the light switch.

Recent Heavy Weather in the Cloud
Cloud services and service providers do reinforce those expectations. By dispersing applications across multiple servers and multiple data centers, these implementations allow for higher levels of fault tolerance. The risk is that the greater complexity needed to build these infrastructures introduces new potential 'technology storms' that can expose a business to unexpected failures and outages.

One need only read the headlines of public cloud outages over the past year, whether at NASDAQ, Amazon, Google, or numerous other providers, to understand that going to the cloud does not guarantee 100% availability, and that outages come with a cost.

  • In January of this year, Dropbox experienced an outage due to a 'routine maintenance episode' on a Friday evening. Customers lost access to services for 2-5 hours, with some outages lasting into the weekend.
  • In August of last year, NASDAQ was shut down for three hours. The root cause was determined to be a 'data flood' of requests that peaked at 26,000 per second (26 times normal volume), exposing a software flaw that prevented the fail-safes from triggering to allow operations to continue.
  • In that same month, Google experienced an outage of its services that lasted only four minutes. In that short period, Internet traffic dropped by 40%. (The fact that the outage lasted only four minutes speaks well of Google's recovery plans and services.)
  • On January 31, 2013, Amazon had an outage that lasted only 49 minutes. The cost to Amazon in lost sales for those 49 minutes is estimated at $4-5 million. (Several companies that utilize Amazon's services, such as Netflix, also felt the impact of this outage.)
  • As far back as two years ago, a large portion of the State of Maryland's public-facing IT services was down for days due to a double failure in the storage sub-systems and their failover systems. No system is immune.

Planning for Availability and Recoverability
Going to the cloud does not in and of itself provide high availability and resiliency. Like any technology architecture, these capabilities need to be designed in, and they come with a cost. Higher availability has always required more effort and higher associated costs, and going to the cloud alone does not necessarily provide what your business is expecting from that light switch.

When moving to cloud architectures, whether they are public or private, business needs and expectations around availability and resiliency must be defined and understood. You cannot take for granted that by being in the cloud the needs will be met. Due diligence must still be performed.

  • When going to the public clouds, you need to make sure the availability requirements from the business are included in the SLAs with the cloud vendor.
  • When building a private cloud network, it is incumbent on the IT organization to ensure the needs and requirements are baked into the design and implementation of that infrastructure, and that expectations with the business are properly set and understood.
  • Risk mitigation plans need to be developed and in place before outages occur, as even the best infrastructure can still fail (as the State of Maryland learned). Going to the cloud does not negate the need to develop and maintain a business continuity plan.
  • If working with a public cloud provider, this is a joint effort, not solely the vendor's responsibility or yours. Vendors will have their own set of plans, and you must dovetail yours with theirs. Make sure you understand what they have in place before signing on the dotted line.
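When reviewing those SLAs, it helps to translate the availability percentage a vendor quotes into the downtime it actually permits. The sketch below is generic "nines" arithmetic, not the terms of any particular vendor's contract:

```python
# Translate an SLA availability percentage into allowable downtime per year.
# These "nines" figures are standard arithmetic, not any specific vendor's terms.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def annual_downtime_minutes(availability_pct):
    """Minutes of downtime per year permitted by a given availability %."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% availability allows {annual_downtime_minutes(pct):,.1f} min/year of downtime")
```

Even "three nines" (99.9%) permits more than eight hours of downtime a year; if your business expects the light switch to always work, make sure the number in the contract matches that expectation.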

No technology negates the need for proper planning, and the cloud is no different. Ultimately, weathering technology's natural disasters in the cloud is accomplished just as we weather those of Mother Nature: prepare a plan, so that when the storm does hit, you can make it out the other side.

More Stories By Ed Featherston

Ed Featherston is a director/senior enterprise architect at Collaborative Consulting. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.

