By Greg Schulz
January 31, 2014 01:16 AM EST
Part III - Until the focus expands to data protection - Taking action
This is the third of a three-part series (read part II here) about how vendors are keeping backup alive, and what they can and should do to shift and expand the conversation to data protection and related themes.
Modernizing is more than simply swapping one technology for another
As I have said for a couple of years now, modernizing data protection (or data protection modernization, if you prefer) is more than simply deduping or swapping out media, tape, disk, clouds, software or services, like fixing a recurring flat tire on an automobile. If you keep getting flat tires, instead of treating the symptom, find and fix the underlying problem. For backup, that means taking a step back and realizing that what is really being done is protecting data (e.g. data protection).
Granted, the security people may not like sharing the term data protection, as some of them prefer to keep it unique, just as some of the compliance people want to keep archiving exclusive to their focus areas; however, let's move on.
On the other hand, data protection also means to protect, preserve and enable data and information to be accessed and served when and where needed, in a cost-effective way, with consistency and coherency.
Sure, there is still the act of making a copy or a backup at time intervals (frequency), with various coverage (how much gets copied), to multiple locations (copies), with versions kept for different amounts of time (retention) to support RTO and RPO, not to mention SLA and SLO for ITSM (how's that for some buzzword bingo ;).
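The knobs above (frequency, coverage, copies, retention) and their relationship to RPO can be sketched in a few lines of code. This is a minimal illustration, not a real product's policy model; the class and field names are assumptions for the sake of the example.

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ProtectionPolicy:
    """Hypothetical model of the protection knobs discussed above."""
    frequency: timedelta   # how often a copy is made
    coverage: str          # how much gets copied, e.g. "full" or "incremental"
    copies: int            # number of copies kept in different locations
    retention: timedelta   # how long versions are kept

    def worst_case_rpo(self) -> timedelta:
        # With copies taken every `frequency`, the most data you can lose
        # is one interval's worth, so worst-case RPO equals the frequency.
        return self.frequency

# Example: daily incremental copies, three locations, 30-day retention
policy = ProtectionPolicy(frequency=timedelta(hours=24),
                          coverage="incremental",
                          copies=3,
                          retention=timedelta(days=30))

# Does this policy satisfy a 24-hour RPO objective?
rpo_objective = timedelta(hours=24)
meets_rpo = policy.worst_case_rpo() <= rpo_objective
```

The point of modeling it this way is that RPO falls out of the schedule you choose, rather than being a separate setting you bolt on afterwards.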
This means using copies, sync (or rsync), snapshots, replication and CDP, along with discrete copies such as backups and all the other buzzword-bingo enabling tools, technologies and techniques (e.g. agent or agentless, archive, availability zones, not to mention bare metal, virtual bare metal, block based, CDP, compression, consolidation, deletion, data management, dedupe, eDiscovery, durability, erasure coding/parity, file level, metadata and policy management, replication, snapshots, RAID, plugins, object storage, NAS, VTL, disk, tape, cloud and virtual, among others). In addition to taking a step back, this also means rethinking why, how, when and where data (and information) gets protected to meet various threat risks as well as diverse business requirements.
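One way to make "aligning techniques to threat risks" concrete is a simple lookup from threat scenario to candidate techniques. The scenarios and pairings below are illustrative assumptions, not a standard or anything the article prescribes:

```python
# Hypothetical mapping of threat/risk scenarios to protection techniques.
# Pairings are for illustration only; a real assessment would be site-specific.
threat_to_techniques = {
    "accidental file deletion": ["snapshots", "file-level backup"],
    "site outage":              ["replication", "availability zones"],
    "ransomware/corruption":    ["CDP", "discrete point-in-time backups"],
    "long-term compliance":     ["archive", "deep cold storage (tape/cloud)"],
}

def techniques_for(threat: str) -> list:
    """Return candidate techniques for a given threat, or an empty list."""
    return threat_to_techniques.get(threat, [])
```

Note that snapshots alone do not cover a site outage, and replication alone faithfully replicates a corruption: each threat needs its own row, which is exactly why a single tool (the hammer) is not enough.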
No tools in the toolbox (physical, virtual or cloud)
Part of the rethinking is expanding the focus from what the tools are, who makes what, how they work, and their features and functions, to how to use the tool or technology for different things.
Various tools (hardware, software, services) for different physical, virtual and cloud tasks
This is like going into a store such as Lowe's or Home Depot and talking to the sales people there (ok, associates or team members) who can tell you everything there is to know about the tool or technology, however they can't tell you how to use it.
Sometimes you get lucky and there is somebody working at the tool (hardware or software) store who will ask what you are trying to do and, based on their experience, suggest a different approach with another tool or tools, along with some supporting material, parts and supplies.
Does this sound familiar to data infrastructure or IT in general, not to mention server, storage, backup and data protection among other areas of interest?
If all you have, or know how to use, is a hammer, then everything starts to look like a nail. Expand your toolbox with more tools AND learn how to use or apply them in new and different ways. Align the right tool, technology and technique to the task at hand!
Expand from talking new technology to using new (and old) things in new ways
In addition to focusing on new tools and technology along with their associated terminologies across physical, virtual and cloud environments, it is also time to expand the discussion and awareness to using new (and old) things in new ways. This also means expanding the terminology from backup/restore to more comprehensive data protection as part of modernizing your environment.
For example, some people (and vendors) use the phrase "Modernizing Data Protection" to mean swapping out tape for disk, disk for cloud, one cloud for another, upgrading from one software version to another, or simply swapping one vendor's software or tool for another, yet continuing to use it for all practical purposes in the same way. Sure, there is value in moving from hourly or daily copies to tape over to direct-to-disk, and then redeploying tape where it is better suited (streaming large amounts of data, powering off to save energy, e.g. deep cold archive). This also means leveraging fast random access for small files that need to be recovered (usually within the first hours or days of being protected).
Aligning tools, technologies, techniques to various threat risk scenarios
Modernizing data protection (also known as transformation) also means recognizing that not everything is the same in the data center or information factory, regardless of size, and that there are different and evolving data access patterns. Another trend to consider is that there is no such thing as an information recession, and that people plus data are living longer as well as getting larger.
Expand your awareness and focus beyond simply knowing what the tools are and who makes them, to how, when, where and why to use them, along with the pros and cons in different situations. This means having multiple tools in your data protection toolbox as well as knowing how to use different tools for various tasks instead of always using a hammer. - GS @StorageIO
The data protection continuum, more than tools and technologies
Call to action, stop talking about it, start walking the talk
If you or somebody else is tired of hearing about backup, then stop complaining about it and take some action. Following are some things to expand your thinking, awareness, discussions and activities around modernizing data protection (and moving beyond traditional backup):
- Take a step back and check the basics or fundamentals of data protection which when enabled, allows your organization to move forward after a small or big incident (or disaster).
- Start thinking beyond backup tools and technologies (hardware, software, services), in particular how it's been done, to why it needs to be done and how it can be done differently.
- Revisit why you are protecting different things, realize that not everything is the same, so does that mean you have to protect everything the same way?
- Learn about how to use different tools and technologies which is different from learning about the tools, features and functions.
- Also keep in mind that the barriers are often people and process (along with organizational politics), which result in new (and old) technologies being used in old ways.
- Think about using different tools and technologies in different (e.g. hybrid) ways.
- This means starting to use new (and old) tools, technologies and techniques in new ways; start to apply your return on innovation by using things to address issues, vs. simply using them for the sake of using them.
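The "revisit why you protect different things" and "not everything is the same" items above can be sketched as tiered policies assigned by data class. The class names and tier settings here are assumptions for illustration, not recommendations:

```python
# Sketch: not everything is the same, so assign protection tiers by data class.
# Class names and RPO values are hypothetical examples.
protection_tiers = {
    "critical-transactional": {"technique": "CDP + replication", "rpo_minutes": 5},
    "general-business":       {"technique": "daily disk backup", "rpo_minutes": 1440},
    "reference-archive":      {"technique": "deep cold archive", "rpo_minutes": 10080},
}

def tier_for(data_class: str) -> dict:
    # Default to the most protective tier when the data class is unknown,
    # erring on the side of over-protecting rather than under-protecting.
    return protection_tiers.get(data_class, protection_tiers["critical-transactional"])
```

The design choice worth noting is the default: classifying everything as critical is safe but expensive, which is exactly the cost argument for doing the classification work in the first place.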
In addition to the above items, here are some added links on various topics and themes mentioned here:
Via StorageIOblog - Only You Can Prevent Cloud Data Loss, Cloud conversations: confidence, certainty and confidentiality, Modernizing data protection with certainty, More Data Footprint Reduction (DFR) Material, More modernizing data protection, virtualization and clouds with certainty, EMC Evolves Enterprise Data Protection with Enhancements and Data protection modernization, more than swapping out media.
Via Internet evolution - People, Not Tech, Prevent IT Convergence.
Closing comments (for now)
Now, having said all of that, it would be unrealistic to think that we can simply drop the term backup overnight and switch to data protection; after all, we need backwards compatibility. However, until the industry, meaning vendors, their pundits (analysts, bloggers, consultants, evangelists), press/media, VARs, investors and customers, starts thinking and speaking in the broader context of data protection and life beyond backup, guess what, we will still be talking about backup. Start calling it (e.g. backup) data protection, and perhaps within a generation (or sooner) the term backup will have been ILM'd, compressed, deduped, tiered, spun down, and put into deep cold archive storage to take a long REST on object storage with a NAS interface in a software-defined hybrid virtualized cloud ;).
Watch for more data protection conversations about related trends, themes, technologies, techniques and perspectives in my ongoing data protection diaries discussions (e.g. www.dataprotectiondiaries.com).
Ok, nuff said