By Ed Featherston
April 14, 2014 09:45 AM EDT
There are things we tend to take for granted in our everyday lives. We have certain expectations that don't even have to be spoken; they are simply a given. If you walk into a room and flip the light switch, it is assumed the lights will go on. If you turn on the faucet, water will come out; if you pick up the telephone, there will be a dial tone. The possibility of any of those things not happening doesn't enter the conversation. These services are ubiquitous; we don't even think about them - they are just there.
In recent years, people have seen the impact Mother Nature can have on core services such as electricity, water and phone. Storms, hurricanes, floods and blizzards have taken our expectations of these services and turned them on their head.
Cloud Computing, the New Light Switch
Cloud computing has become pervasive in both our personal and business lives; you cannot have a conversation about technology without the word "cloud" in it.
On a personal level, our music players are streaming from the cloud, our tablets and eReaders are getting books from the cloud, our TVs are streaming video from the cloud, and our smartphones and PCs are being backed up to the cloud. Google has glasses that connect you to the cloud, and Samsung just came out with a watch that connects you to the cloud. Like the electricity and water in your home, the cloud is always there - at least that has become the perception and expectation.
On a business level, our expectations are influenced by our personal exposure and experiences with technology. There is an assumption that by going to the cloud, the services provided will always be there, like the light switch.
Recent Heavy Weather in the Cloud
Cloud services and service providers reinforce those expectations. By dispersing applications across multiple servers and multiple data centers, cloud implementations allow for higher levels of fault tolerance. The risk is that the added complexity needed to build these infrastructures introduces new potential 'technology storms' that can expose a business to unexpected failures and outages.
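As a back-of-the-envelope illustration (my sketch, not from any specific provider's design), the fault-tolerance benefit of dispersing an application across redundant copies can be quantified - along with the assumption that makes the math optimistic:

```python
# Availability gain from redundancy, assuming INDEPENDENT failures.
# That independence assumption is a simplification: correlated outages
# (a shared network, a common software flaw) are exactly the
# "technology storms" that break this model in practice.

def parallel_availability(single: float, replicas: int) -> float:
    """Availability of a service that stays up as long as any one
    of `replicas` independent copies is up."""
    return 1 - (1 - single) ** replicas

# A single server at 99% availability, then 2 and 3 redundant copies:
one = 0.99
print(round(parallel_availability(one, 1), 6))  # 0.99
print(round(parallel_availability(one, 2), 6))  # 0.9999
print(round(parallel_availability(one, 3), 6))  # 0.999999
```

The formula also shows why the complexity is worth paying for when done right - each added independent replica multiplies the unavailability by the single-copy failure rate - but only when the failures really are independent.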
One need only read the headlines of public cloud outages over the last year - whether NASDAQ, Amazon, Google, or numerous other providers - to understand that going to the cloud does not guarantee 100% availability, and that outages come with a cost.
- In January of this year, Dropbox experienced an outage due to a 'routine maintenance episode' on a Friday evening. Customers experienced a 2-5 hour loss of access to services, with some outages lasting into the weekend.
- In August of last year, NASDAQ was shut down for three hours. The root cause was determined to be a 'data flood' of requests that peaked at 26,000 per second (26 times normal volume), exposing a software flaw that prevented the fail-safes from being triggered to allow operations to continue.
- In that same month, Google experienced an outage of their services that only lasted 4 minutes. In that short period of time, Internet traffic dropped by 40%. (The fact the outage only lasted 4 minutes speaks well of Google's recovery plans and services.)
- On January 31, 2013, Amazon had an outage that lasted only 49 minutes. The cost to Amazon in lost sales for those 49 minutes is estimated at $4-5 million. (Several other companies that utilize Amazon's services, such as Netflix, also felt the impact of this outage.)
- As far back as two years ago, a large portion of the State of Maryland's IT services to the public were down for days due to a double failure in the storage subsystems and their failover systems. No system is immune.
Planning for Availability and Recoverability
Going to the cloud does not in and of itself provide high availability and resiliency. Like any technology architecture, these capabilities need to be designed in, and they come with a cost. Higher availability has always required more effort and expense, and going to the cloud alone does not necessarily provide what your business is expecting from that light switch.
When moving to cloud architectures, whether they are public or private, business needs and expectations around availability and resiliency must be defined and understood. You cannot take for granted that by being in the cloud the needs will be met. Due diligence must still be performed.
- When going to the public clouds, you need to make sure the availability requirements from the business are included in the SLAs with the cloud vendor.
- When building a private cloud network, it is incumbent on the IT organization to ensure the needs and requirements are baked into the design and implementation of that infrastructure, and that expectations with the business are properly set and understood.
- Risk mitigation plans need to be developed and in place before outages occur, as even the best infrastructure may still have a failure (such as the State of Maryland). Going to the cloud does not negate the need to develop and have a business continuity plan.
- If working with a public cloud provider, this is a joint effort, not solely the vendor's responsibility or yours. Vendors will have their own set of plans, and you must dovetail yours with theirs. Make sure you understand what they have in place before signing on the dotted line.
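When comparing business availability requirements against a vendor's SLA, it helps to translate the percentage into a concrete downtime budget. A minimal sketch (the function name and sample figures are illustrative, not drawn from any particular vendor's SLA):

```python
# Translate an SLA availability percentage into an annual downtime
# budget, so "the light switch must always work" can be compared
# against what a vendor actually commits to in writing.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_budget_minutes(availability_pct: float) -> float:
    """Allowed downtime per year, in minutes, for a given SLA level."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99, 99.999):
    print(f"{sla}% uptime -> {downtime_budget_minutes(sla):,.1f} min/year")
```

Run against the incidents above, the math is sobering: Amazon's 49-minute outage alone would have consumed nearly an entire year's budget under a 99.99% SLA (about 52.6 minutes per year).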
No technology negates the need for proper planning, and the cloud is no different. Ultimately, weathering technological disasters in the cloud works just like weathering those of Mother Nature: prepare a plan, so that when the storm does hit, you can make it out the other side.