

@CloudExpo: Blog Post

Storms in the Cloud

No technology negates the need for proper planning, and the cloud is no different

There are things we tend to take for granted in our everyday lives. We have certain expectations that don't even have to be spoken; they're simply a given. Walk into a room and flip the light switch, and the lights go on. Turn on the faucet, and water comes out; pick up the telephone, and there is a dial tone. The possibility of any of those things not happening never enters the conversation. These are services that are ubiquitous; we don't even think about them - they are just there.

In recent years, people have seen the impact Mother Nature can have on core services such as electricity, water, and phone. Storms, hurricanes, floods, and blizzards have taken our expectations of these services and turned them on their head.

Cloud Computing, the New Light Switch
Cloud computing has become pervasive in both our personal and business lives; you cannot have a conversation about technology without the word "cloud" in it.

On a personal level, our music players stream from the cloud, our tablets and eReaders get books from the cloud, our TVs stream video from the cloud, and our smartphones and PCs are backed up to the cloud. Google has glasses that connect you to the cloud, and Samsung just came out with a watch that does the same. Like the electricity and water in your home, the cloud is always there - at least that has become the perception and expectation.

On a business level, our expectations are influenced by our personal exposure and experiences with technology. There is an assumption that by going to the cloud, the services provided will always be there, like the light switch.

Recent Heavy Weather in the Cloud
Cloud services and service providers do reinforce those expectations. By dispersing applications across multiple servers and multiple data centers, these implementations allow for higher levels of fault tolerance. The risk is that the added complexity needed to build these infrastructures introduces new potential "technology storms" that can expose a business to unexpected failures and outages.

One need only read the headlines of public cloud outages over the last year - whether NASDAQ, Amazon, Google, or numerous other providers - to understand that going to the cloud does not guarantee 100% availability, and that outages come with a cost.

  • In January of this year, Dropbox experienced an outage due to a "routine maintenance episode" on a Friday evening. Customers lost access to services for 2-5 hours, with some outages lasting into the weekend.
  • In August of last year, NASDAQ was shut down for 3 hours. The root cause was determined to be a "data flood" of requests that peaked at 26,000 per second (26 times normal volume), exposing a software flaw that prevented the fail-safes from triggering to allow operations to continue.
  • In that same month, Google experienced an outage of its services that lasted only 4 minutes. In that short period of time, Internet traffic dropped by 40%. (The fact that the outage lasted only 4 minutes speaks well of Google's recovery plans and services.)
  • On January 31st, 2013, Amazon had an outage that lasted only 49 minutes. The cost to Amazon in lost sales for those 49 minutes is estimated at $4-5 million. (Several other companies that utilize Amazon's services, such as Netflix, also experienced the impact of this outage.)
  • As far back as two years ago, a large portion of the State of Maryland's public-facing IT services was down for days due to a double failure in the storage sub-systems and their failover systems. No system is immune.
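Taking the Amazon figure above at face value, the cost of downtime per minute is simple arithmetic. A quick back-of-the-envelope sketch (the dollar figures are the estimates cited above, not independently verified):

```python
# Back-of-the-envelope cost of the 49-minute Amazon outage,
# using the $4-5M lost-sales estimate cited above.
OUTAGE_MINUTES = 49
LOW_ESTIMATE = 4_000_000   # USD, low end of the estimate
HIGH_ESTIMATE = 5_000_000  # USD, high end of the estimate

low_per_min = LOW_ESTIMATE / OUTAGE_MINUTES
high_per_min = HIGH_ESTIMATE / OUTAGE_MINUTES

print(f"Roughly ${low_per_min:,.0f}-${high_per_min:,.0f} in lost sales per minute")
```

That works out to roughly $80,000-$100,000 per minute, which is the kind of number that should anchor any availability conversation with the business.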

Planning for Availability and Recoverability
Going to the cloud does not in and of itself provide high availability and resiliency. Like any technology architecture, these capabilities need to be designed in, and they come with a cost. Higher availability has always required more effort and associated expense, and going to the cloud alone does not necessarily provide what your business is expecting from that light switch.

When moving to cloud architectures, whether public or private, business needs and expectations around availability and resiliency must be defined and understood. You cannot take for granted that being in the cloud means those needs will be met. Due diligence must still be performed.

  • When going to a public cloud, make sure the business's availability requirements are included in the SLAs with the cloud vendor.
  • When building a private cloud network, it is incumbent on the IT organization to ensure the needs and requirements are baked into the design and implementation of that infrastructure, and that expectations with the business are properly set and understood.
  • Risk mitigation plans need to be developed and in place before outages occur, as even the best infrastructure may still have a failure (such as the State of Maryland). Going to the cloud does not negate the need to develop and have a business continuity plan.
  • If working with a public cloud provider, this is a joint effort, not solely the vendor's responsibility or yours. Vendors will have their own set of plans, and you must dovetail yours with theirs. Make sure you understand what they have in place before signing on the dotted line.
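When negotiating those SLAs, it helps to translate availability percentages into a concrete downtime budget the business can react to. A minimal sketch (the SLA tiers shown are illustrative, not any particular vendor's terms):

```python
# Convert an availability percentage into the downtime it permits per year.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours (ignoring leap years)

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Minutes of downtime allowed per year at the given availability."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR * 60

for sla in (99.0, 99.9, 99.99, 99.999):
    print(f"{sla}% availability -> {downtime_minutes_per_year(sla):,.1f} min/year")
```

The jump from "two nines" to "five nines" shrinks the annual budget from about 87 hours to about 5 minutes, which is why each additional nine costs disproportionately more to engineer.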

No technology negates the need for proper planning, and the cloud is no different. Ultimately, weathering the technological disasters in the cloud is accomplished just like weathering those of Mother Nature: prepare a plan, so that when the storm does hit, you can make it out the other side.

More Stories By Ed Featherston

Ed Featherston is a senior enterprise architect and director at Collaborative Consulting. He brings more than 34 years of information technology experience designing, building, and implementing large, complex solutions. He has significant expertise in systems integration, Internet/intranet, client/server, middleware, and cloud technologies. Ed has designed and delivered projects for a variety of industries, including financial services, pharmacy, government, and retail.


