How Often Should You Check Your Backups?

We are regularly told that checking our own bodies for signs of change is a good thing.  Early diagnosis of disease gives a better chance of curing the problem.  So, in the IT world, where we assume all of our backups have been taken successfully, how often should we be checking the results and ensuring the backup will work on the fateful day we need to do a restore?  This question was posed by Federica Monsone on Twitter this week.  Here’s an attempt to provide an answer.

First of all, let’s consider the whole point of taking backups.  Excluding the inappropriate use of backup for archiving, the backup process is there to ensure you can maintain continuous access to your data in the event of unforeseen circumstances.  Usually (but not exclusively) these are data loss due to equipment or power failure, data corruption (whether from a software bug or malicious action), accidental deletion, or a need to return to a previous point in time for consistency purposes where there are multiple interrelated systems.

Backups will be used infrequently and inevitably, like insurance, you never know how good your backups are until you come to use them.  I wouldn’t advocate crashing your car just to check your insurer will pay out; however, periodic validation of backups, and more importantly of the restore process, is a good thing.  Why?  Because in a recovery scenario, you want to be confident your backups have worked and that the process of recovery will be as smooth as possible.  Recovering data is typically a time-critical operation.  You’re recovering data because somebody needs the information quickly, or because a system is down.  When the pressure is on, the process needs to work flawlessly.  In addition to the time pressures, restores should be periodically checked because:

  • Backup media deteriorates over time; you should be ensuring any failing media is replaced.
  • Backup software upgrades can cause issues with restores if data formats are changed.
  • Server software upgrades can cause issues with restores.

So, to the heart of the matter: how often to test restores.  I believe restore testing should be based on the criticality of the data and on the complexity of the backup infrastructure.  So, if data integrity is the essence of your business, test restores more often.  If you have a shared backup infrastructure, test restores from that; if you have a more distributed design, you’ll need to test each backup component.  Here are some thoughts:

  • Test restores of individual files on a weekly basis
  • Test restores of large volumes of data on a monthly basis
  • Test whole system restores every six months
  • Randomly select media for restore; choose new and old media alike
  • Test restores into your DR site (if you have one)
  • Replace faulty media immediately
  • Have a media retirement policy
  • Have a backup onboarding policy
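
As a sketch of how the weekly file-level check above could be automated: restore a randomly selected file to a scratch area and compare its checksum against one recorded at backup time.  The paths, the manifest format, and the file copy standing in for a real restore command are all assumptions for illustration, not any particular backup product’s interface.

```python
# Minimal sketch of an automated test restore (illustrative only).
# The manifest of checksums and the copy-based "restore" are assumptions;
# in a real environment you would invoke your backup tool's restore command.
import hashlib
import random
import shutil
import tempfile
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_random_restore(backup_dir: Path, manifest: dict) -> bool:
    """Restore one randomly chosen file to a scratch directory and check
    its checksum against the value recorded when the backup was taken."""
    name = random.choice(list(manifest))
    with tempfile.TemporaryDirectory() as scratch:
        restored = Path(scratch) / name
        # Stand-in for the real restore step (e.g. your backup tool's CLI).
        shutil.copy2(backup_dir / name, restored)
        return sha256_of(restored) == manifest[name]
```

In practice the copy line would be replaced by your backup software’s restore command, and a failed comparison should raise an alert rather than quietly return False; the point is that the check runs on a schedule, not when disaster strikes.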

There’s no right or wrong way to approach testing restores; it’s all about building confidence in the restore process, so when you need it, you can be happy it will work for you.
