
Why Cloud Computing Must Be Included in Disaster Recovery Planning

It is of little surprise that IT is viewed as both a key enabler of risk management and a key threat

It is now official: according to the United Nations, 2010 was one of the deadliest years for natural disasters of the past two decades.

Statistics released yesterday are shocking and heartbreaking in equal measure. Some 373 natural disasters claimed the lives of more than 296,800 people last year, affecting nearly 208 million people at an estimated cost of nearly $110 billion. To put this in perspective, the loss of life equates to losing the entire population of a UK city the size of Nottingham or Leicester.

The research was compiled by the Centre for Research on the Epidemiology of Disasters (CRED) of the Université catholique de Louvain in Belgium, and supported by the UN International Strategy for Disaster Reduction (UNISDR). Unfortunately, according to the same report, the disasters that befell 2010 may be just the start of an unwelcome trend. Indeed, the UN assistant secretary-general for disaster risk reduction, Margareta Wahlström, stated that last year’s figures may well be viewed as benign in years to come.

“Unless we act now, we will see more and more disasters due to unplanned urbanization and environmental degradation. And weather-related disasters are sure to rise in the future, due to factors that include climate change.”

Ms. Wahlström went on to state that disaster risk reduction is no longer optional: “What we call ‘disaster risk reduction’ - and what some are calling ‘risk mitigation’ or ‘risk management’ - is a strategic and technical tool for helping national and local governments to fulfil their responsibilities to citizens.”

As if we needed further reminding of the impact that natural disasters can have on communities, at the same time as Ms. Wahlström was spearheading the UN’s media activities, the Royal Australian College of General Practitioners (RACGP) was issuing guidance to help GPs overcome the challenge of restoring IT functionality after the recent Queensland floods.

Professor Claire Jackson, RACGP President and GP in Brisbane, stated: “This will be a time that will test the disaster recovery and business continuity planning of many general practices. The RACGP’s IT disaster recovery flowchart and fact sheet will provide guidance with the often technically difficult challenges of restoring IT systems and procedures to their full functionality.”

It is a sad fact of life that once the world’s media turns its cameras towards the next news story, we soon forget about the impact that a disaster has had on a particular community; for some perverse reason, no matter how graphic the images or how sad the stories, we move on, safe in the knowledge that ‘it can’t happen to us.’ And whilst, for a whole host of geographic or socio-economic reasons, the majority of us will thankfully never experience the pain and devastation that a large-scale natural disaster can bring, every one of us can, and will, suffer some type of ‘personal disaster’ whose effects some basic precautionary measures could have mitigated.

As the world and its citizens become ever more reliant on technology to function day to day, it is of little surprise that IT is viewed as both a key enabler of risk management and a key threat. We can assume that the Queensland GPs’ patients’ medical records were stored electronically to improve both service and process delivery, and can only hope that they were not stored in a single data centre location that has now been washed away.

How many times have you heard people complain that they are ‘suffering the day from hell’ because their laptop has been stolen/lost/damaged and they’ve lost all their work/photos/contacts (delete as appropriate)? For some bizarre reason, we are supposed to empathise with them, much in the same way that folks without basic desktop protection are filled with indignation that someone with a dubious moral compass has dared to plant a Trojan on their PC or attempted to steal their bank account details.

In the recent past, two of the reasons oft quoted by businesses and consumers for not taking disaster recovery too seriously have been cost and complexity. And who could argue with the business logic? Purchasing and maintaining a full set of servers to mirror an existing infrastructure could be viewed as an expensive overhead, particularly if considered simply an ‘insurance policy’ that will in all probability never be ‘claimed.’ Many an IT department’s justifiable claim for additional DR capex has fallen on ‘deaf ears’ over the years - usually dismissed out of hand as ‘expense for expense’s sake.’

Likewise, consumers have probably been baffled by the thought of regularly archiving and backing up their treasured holiday snaps. Where to, and how?

As stated earlier, technology does have a major role to play in disaster recovery.

Cloud computing is now being considered a genuinely viable disaster recovery/business continuity solution at all levels and for all markets. According to the Cloud Convergence Council, cloud service providers are currently reporting that one in four end-users is asking for cloud-based disaster recovery and backup services.

The reason is simple. As mentioned previously, you don’t need DR services until something goes wrong - making them a cost centre, not a profit centre. Moving disaster recovery into the cloud can deliver greater efficiency, security, scalability, and much-desired cost savings.

There can be no doubt that the ability to store, back up and archive data to an offsite third party, via the cloud, is compelling, but as with anything ‘risk’ related, there are several considerations to weigh before deciding upon a solution.
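Before turning to those considerations, here is a minimal Python sketch of the first step any offsite backup takes: bundle the data into a single archive and record a checksum, so that after a restore you can prove that what came back is what went out. The paths are hypothetical, and the upload step is deliberately omitted because it depends entirely on which cloud supplier you choose.

```python
import hashlib
import shutil
from pathlib import Path

def archive_with_checksum(source_dir, staging_dir):
    """Bundle a directory into one archive and record its SHA-256 digest,
    so integrity can be verified after an offsite restore."""
    staging = Path(staging_dir)
    staging.mkdir(parents=True, exist_ok=True)
    # shutil.make_archive returns the path of the created .tar.gz file
    archive = Path(shutil.make_archive(str(staging / "backup"), "gztar", source_dir))
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    (staging / (archive.name + ".sha256")).write_text(f"{digest}  {archive.name}\n")
    return archive, digest

# Hypothetical paths; the provider-specific upload would follow this step.
archive, digest = archive_with_checksum("/data/practice-records", "/tmp/dr-staging")
print(f"archived {archive.name}, sha256 {digest[:12]}...")
```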

You will need to consider whether you want your data to be held within a physical geographical boundary. If you do, you will need to ensure that you contract with a cloud provider who can guarantee that their data centre is within your desired territory. You will also need to know how and where your data is stored once it is in your provider’s cloud. Will it be stored within one physical location, thereby increasing the risk of a single point of failure, or will it be distributed across nodes in more than one data centre? You may be happy that the provider’s ‘resilience’ simply involves separate racks in separate data halls, in which case you will need to be convinced that their site is served with diverse power supplies and is free from potential natural hazards.

On the other hand, you may feel that this level of cloud service equates to an ‘all eggs in one basket’ approach, in which case you might opt for a cloud supplier with multiple data centres offering multiple resilience options across the DC estate. This will in all likelihood be a more expensive option, but one which is less ‘risky.’
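To make the ‘single point of failure’ question concrete, here is a small, self-contained Python sketch, with invented data centre names, that asks two of the questions above of a set of storage replicas: is every copy inside the required territory, and is the data spread across enough distinct sites?

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Replica:
    data_centre: str  # e.g. "LON-1" (invented identifiers)
    country: str      # e.g. "UK"

def placement_ok(replicas, territory, min_sites=2):
    """True if every copy sits in the required territory AND the copies
    span enough distinct data centres to avoid a single point of failure."""
    in_territory = all(r.country == territory for r in replicas)
    distinct_sites = len({r.data_centre for r in replicas})
    return in_territory and distinct_sites >= min_sites

# 'All eggs in one basket' versus two sites within the same territory.
single_site = [Replica("LON-1", "UK"), Replica("LON-1", "UK")]
multi_site = [Replica("LON-1", "UK"), Replica("EDI-1", "UK")]
print(placement_ok(single_site, "UK"))  # False - one data centre only
print(placement_ok(multi_site, "UK"))   # True - two sites, both in territory
```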

This leads us to the question of cost. Many cloud services are charged on a per-MB, GB or TB usage basis, which can make predictable budgeting a challenge. One blue-chip company that recently considered moving to the cloud for data replication estimated that, due to the variable nature of the cloud provider’s billing, it would cost $55,000 more over a period of three years than running a comparable in-house system that would regularly and automatically fail over as required.
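A back-of-the-envelope model shows how that can happen. Every figure below is an illustrative assumption, not the company’s actual numbers; the point is that a usage-based bill grows with your data, while the in-house cost is largely fixed and therefore budgetable.

```python
# Purely illustrative assumptions - not real pricing or real company figures.
MONTHS = 36
PRICE_PER_GB = 0.25   # cloud: variable, per-GB-per-month billing
CAPEX = 100_000       # in-house: mirrored servers bought up front
MONTHLY_OPEX = 2_300  # in-house: power, space, support

# Assume 15 TB of replicated data, growing roughly 3% a month.
stored_gb = [15_000 * 1.03 ** m for m in range(MONTHS)]

cloud_cost = sum(gb * PRICE_PER_GB for gb in stored_gb)
in_house_cost = CAPEX + MONTHLY_OPEX * MONTHS

print(f"cloud over 3 years:    ${cloud_cost:,.0f}")     # ~$237,000
print(f"in-house over 3 years: ${in_house_cost:,.0f}")  # $182,800
print(f"difference:            ${cloud_cost - in_house_cost:,.0f}")
# With these assumptions the variable bill comes out roughly $55,000 higher -
# and, crucially, it was never a fixed number you could have put in a budget.
```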

Once again, you should seek out a cloud provider that offers some element of inclusive fixed pricing/packaging, or one that will agree a fixed, tiered pricing model with you.
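A fixed, tiered model might look like the following hypothetical sketch. The tier boundaries and fees are invented; the shape is what matters: usage anywhere within a band costs the same flat fee, which is something you can actually put in a budget.

```python
# Hypothetical agreed tiers: (GB ceiling, flat monthly fee in $).
TIERS = [(1_000, 400), (5_000, 1_500), (20_000, 4_800)]

def monthly_fee(stored_gb):
    """Return the flat fee for whichever band usage falls into."""
    for ceiling, fee in TIERS:
        if stored_gb <= ceiling:
            return fee
    raise ValueError("usage exceeds the largest agreed tier - time to renegotiate")

print(monthly_fee(3_200))  # 1500: anywhere in the 1-5 TB band costs the same
```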

Finally, and most importantly, seek a provider that offers cast-iron service level agreements for the cloud. If their marketing blurb states that they can have you ‘fully restored and running’ within an hour, determine exactly what, contractually, ‘fully restored and running’ means - and a definition of an ‘hour’ might also be useful.
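As a sketch of why those definitions matter, assume (hypothetically) that the contract pins the recovery clock between two agreed events. The test is then mechanical - and it is your verification, not the provider’s announcement, that stops the clock.

```python
from datetime import datetime, timedelta

CONTRACTUAL_RTO = timedelta(hours=1)  # the advertised 'restored within an hour'

def within_sla(outage_declared, service_verified):
    """The clock runs between contractually defined events: from the moment
    the outage is declared to the moment your own checks confirm the service
    works - not the moment the provider announces 'restore complete'."""
    return service_verified - outage_declared <= CONTRACTUAL_RTO

declared = datetime(2011, 1, 25, 9, 0)
verified = datetime(2011, 1, 25, 10, 12)  # restore 'done', but checks took time
print(within_sla(declared, verified))     # False - 72 minutes misses the window
```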

As with any form of insurance policy, the more ticks you place in the boxes - the greater the degree of protection - the higher the premium. You should not expect a full cloud ‘belt and braces’ DR solution to be cheap. Total peace of mind does cost.

No one has yet stated that the cloud will solve every disaster recovery/business continuity issue, but it should certainly be considered a workable solution to an age-old problem.

It would, however, appear somewhat ironic that we are turning to a meteorologically named technological DR solution in a week when we are being advised to plan better because “weather-related disasters are sure to rise in the future.”


More Stories By Phil Worms

Phil is a 30-year IT industry veteran with a passion for education and has personally led many school and higher education initiatives designed to engage young people and showcase the broad range of exciting and fulfilling roles in IT.

A full and varied career has seen Phil move through various senior product/project and marketing positions with companies as diverse as Centrica plc, One.Tel, VarTec Telecom and iomart Group plc. Phil is working on a project to create an intergenerational social hub that will celebrate creativity and achievement in Helensburgh, birthplace of television pioneer John Logie Baird. The Heroes Centre will provide people of all ages with the new media and content creation skills required to engage fully in the digital world. Follow his progress on Twitter and on Facebook.
