Cloud Backup & Disaster Recovery Predictions for 2014

Cloud backup adoption made great strides in 2013, but it was also a year of lessons learned

Author: Nick Mueller, Zetta.net

2013 was a banner year for cloud backup adoption. It was also a year of wake-up calls: simple cloud backup doesn't constitute DR, transfer speeds are vital, and commodity cloud is a risky bet. Let's look to 2014 for the trends shaping cloud backup and cloud-based DR.

1. Cloud Backup Vendors with Slow Performance Will Fail
This trend showed up in the news as well-known cloud backup vendors shut their services down. Symantec Backup Exec Cloud was the biggest casualty of slow performance, and other cloud backup products struggled or folded because they were not optimized for backup or recovery speed.

Just offering a backup-to-cloud option isn't enough anymore. Users appreciate the scalability and cost-effectiveness of the cloud, but they also want the same level of backup and restore performance they had on-premise. Native optimization for the Internet and high data transfer speeds are the only way to reach that performance level.

2. MSPs and VARs Will Embrace Cloud Backup as a Profit Center
Many MSPs and VARs want to offer Disaster Recovery as a Service (DRaaS) because their customers want it. But margins are thin, and it's tough to bring in enough revenue to turn a profit, let alone invest in higher-priced services. MSPs know they need to raise recurring revenue and lower client backup management costs, and they need a cost-effective, high-performance service to make that happen. MSPs can build recurring revenue by offering services like these:

  • Optimize cloud backup and recovery so the MSP's service is at least as fast as their customers' on-premise backup.
  • Earn customer trust with verified backup, which takes the burden of continual checking off their customers' shoulders (a minimal verification sketch follows this list).
  • Offer a high-performance cloud without high cost to the MSP. This means a solution that's natively optimized for the cloud, without the costs of a hardware appliance.
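
For illustration, here is a minimal sketch of what backup verification can look like under the hood: hash the source data, then hash the restored (or spot-restored) copy and compare digests. The file paths and helper names are hypothetical, and this is not any particular vendor's implementation.

    # Minimal verified-backup sketch: hash a file before upload, then re-hash
    # the restored copy and compare digests. Paths below are hypothetical.
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Stream the file in 1 MiB chunks so large backups fit in memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_backup(original: Path, restored: Path) -> bool:
        """True if the restored copy is bit-for-bit identical to the original."""
        return sha256_of(original) == sha256_of(restored)

    if __name__ == "__main__":
        ok = verify_backup(Path("quarterly-report.db"),
                           Path("restore/quarterly-report.db"))
        print("backup verified" if ok else "MISMATCH - re-run backup")

In practice an MSP would schedule checks like this automatically rather than leaving customers to run them by hand.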

Profit margins grow as back-end costs shrink, and the MSP sees recurring revenue from happy customers. MSPs and VARs can then afford to develop premium services and bring them to market faster.

3. Appliance-Based Backup Solutions Show Their Age
A lot of traditional backup vendors extended backup to the cloud using on-premise appliances. Early appliances were an innovative way to collect backup data and then transfer it to the cloud; they were also a good way for backup vendors with big installed bases to keep their customers.

The problem is that an on-premise appliance can be an expensive proposition and does little to accelerate cloud performance. This is particularly awkward if you are an MSP: hardware appliances at customer sites need repair and replacement, and software appliances need troubleshooting and upgrades.

You can replace appliances with a cloud-native backup offering that builds in features like WAN optimization and multi-channel data transfer over REST APIs. This takes the burden of supporting appliances off the table and speeds up performance.
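
As a rough sketch of what "multi-channel" transfer over REST can mean in practice, the snippet below splits a backup file into chunks and uploads them over several concurrent HTTPS connections so a single slow TCP stream doesn't cap WAN throughput. The endpoint URL, chunk size, and channel count are illustrative assumptions, not Zetta's actual API.

    # Hedged sketch of multi-channel upload over a REST-style endpoint.
    # The URL below is hypothetical; real services expose their own
    # chunked/multipart upload APIs.
    import os
    from concurrent.futures import ThreadPoolExecutor

    import requests  # pip install requests

    UPLOAD_URL = "https://backup.example.com/api/v1/objects/{name}/chunks/{index}"  # hypothetical
    CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per request
    CHANNELS = 4                  # parallel connections

    def upload_chunk(name: str, index: int, data: bytes) -> int:
        resp = requests.put(UPLOAD_URL.format(name=name, index=index),
                            data=data, timeout=60)
        resp.raise_for_status()
        return index

    def upload_file(path: str) -> None:
        name = os.path.basename(path)
        with open(path, "rb") as f, ThreadPoolExecutor(max_workers=CHANNELS) as pool:
            futures, index = [], 0
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                futures.append(pool.submit(upload_chunk, name, index, chunk))
                index += 1
            for fut in futures:
                fut.result()  # surface any failed chunk so the caller can retry

    if __name__ == "__main__":
        upload_file("nightly-backup.tar.gz")

The point is the structure, not the endpoint: several smaller transfers in flight at once tend to fill a long-latency WAN link better than one serial stream.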

4. Commodity Cloud Doesn't Work Unless Your Name Is Azure, Amazon or Google
Big established cloud hosts make money on huge economies of scale and the sheer volume of data they have under management. Otherwise, commodity cloud is a losing game, as Nirvanix can attest. Nirvanix declared Chapter 11 after deciding to build its own cloud infrastructure: in spite of big clients like IBM, it could not generate enough revenue to offset its enormous expenditure.

Moving into 2014, the brightest opportunities in the cloud are targeted solutions aimed at businesses that are willing to pay for them. Disaster recovery in the cloud is one of the most attractive of these offerings. True cloud DR isn't just backup to the cloud; it's specifically architected and custom-built for enterprise-grade backup and recovery performance.

Zetta, for example, is built specifically for the Internet with WAN optimization and accelerated cloud performance, and it achieves speeds that are often faster than local backup.

5. Companies Realize that Cloud Backup Means Cloud Recovery Too
Recovering data over a slow pipe can be even worse than the initial backup: just when companies need to recover their data fast, slow data transfer speeds threaten the whole recovery process.
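
Some back-of-the-envelope arithmetic shows why. The numbers below are illustrative assumptions, not figures from the article: even at 80% link utilization, a full restore of a few terabytes over a typical business WAN link takes days, not hours.

    # Illustrative restore-time arithmetic (assumed data sizes and link speed).
    def restore_hours(data_tb: float, link_mbps: float, efficiency: float = 0.8) -> float:
        """Hours to move data_tb terabytes over a link_mbps line at the given utilization."""
        bits = data_tb * 1e12 * 8                      # terabytes -> bits
        seconds = bits / (link_mbps * 1e6 * efficiency)
        return seconds / 3600

    for tb in (1, 5, 20):
        print(f"{tb:>2} TB over 100 Mbps: ~{restore_hours(tb, 100):.0f} h")
    # 1 TB -> ~28 h, 5 TB -> ~139 h, 20 TB -> ~556 h

At that rate, a recovery time objective measured in hours is simply out of reach without optimization.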

Commodity cloud storage behind legacy backup software won't work for companies with high-performance recovery needs. Recovery from the cloud is all about recovery time objectives (RTO), and that's exactly what legacy backup to the cloud can't deliver. Some vendors get around the problem by copying customer recovery data onto disks and trucking them to the customer site. That's better than an impossibly long recovery over the WAN, but hardly the stuff that dreams are made of.

Companies avoid this problem by sending to the cloud only the data they can stand to recover slowly. That means they can't apply the economies of the cloud to priority data constrained by recovery time, unless they turn to an optimized cloud backup and DR offering. We'll see a lot more of this customer movement to cloud-native services in 2014.

We built these predictions for 2014 on 2013 events and Zetta customer needs and wants.

Nick is Zetta's Chief Content Officer, and has been working with writing and social media teams to create digital content since the days when the BBS reigned.
