By Greg Schulz
January 31, 2014 01:16 AM EST
Part III - Until the focus expands to data protection - Taking action
This is the third of a three-part series (read part II here) about how vendors are keeping backup alive, and what they can and should do to shift and expand the conversation to data protection and related themes.
Modernizing is more than simply swapping one technology for another
As I have said for a couple of years now, modernizing data protection (or data protection modernization, if you prefer) is more than simply deduping or swapping out media (tape, disk, clouds, software or services) like a recurring flat tire on an automobile. If you keep getting flat tires, instead of treating the symptom, find and fix the underlying problem. For backup, that means taking a step back and realizing that what is really being done is protecting data (e.g. data protection).
Granted, the security people may not like sharing the term data protection, as some of them prefer to keep it unique, just as some of the compliance people want to keep archiving exclusive to their focus areas; however, let's move on.
On the other hand, data protection also means protecting, preserving and enabling data and information to be accessed and served when and where needed, in a cost-effective way, with consistency and coherency.
Sure, there is still the act of making a copy or a backup at time intervals (frequency), with various coverage (how much gets copied), to multiple locations (copies), with versions kept for different amounts of time (retention) to support RTO and RPO, not to mention SLA and SLO for ITSM (how's that for some buzzword bingo ;).
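As a rough sketch of how those knobs relate, consider the following. The class, names and numbers here are illustrative assumptions for the example, not any vendor's model; the point is that the copy interval (frequency) bounds your worst-case data loss, which is what an RPO target constrains.

```python
from dataclasses import dataclass

@dataclass
class ProtectionPolicy:
    name: str
    frequency_hours: int   # how often a copy is made (frequency)
    copies: int            # how many copies/locations are kept
    retention_days: int    # how long versions are kept (retention)

    def meets_rpo(self, rpo_hours: int) -> bool:
        # Worst-case data loss is roughly one copy interval,
        # so the interval must not exceed the RPO target.
        return self.frequency_hours <= rpo_hours

daily = ProtectionPolicy("daily-full", frequency_hours=24, copies=2, retention_days=30)
hourly = ProtectionPolicy("hourly-incremental", frequency_hours=1, copies=3, retention_days=14)

print(daily.meets_rpo(4))   # a 24-hour copy interval cannot meet a 4-hour RPO
print(hourly.meets_rpo(4))  # an hourly interval can
```

The same shape extends naturally to RTO, coverage and location constraints; the exercise is deciding those targets per workload rather than one-size-fits-all.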
This means using copies, sync (or rsync), snapshots, replication and CDP, along with discrete copies such as backups and all the other buzzword-bingo enabling tools, technologies and techniques (e.g. agent or agentless, archive, availability zones, bare metal, virtual bare metal, block-based, CDP, compression, consolidation, deletion, data management, dedupe, eDiscovery, durability, erasure coding/parity, file-level, metadata and policy management, replication, snapshots, RAID, plugins, object storage, NAS, VTL, disk, tape, cloud and virtual, among others). In addition to taking a step back, this also means rethinking why, how, when and where data (and information) gets protected to meet various threat risks as well as diverse business requirements.
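One way to make "which technique for which threat risk" concrete is a simple lookup. The threat names and technique groupings below are hypothetical and illustrative, not taken from any specific product or framework; the point is that different threats call for different tools from that list, not one default answer.

```python
# Hypothetical mapping of threat scenarios to protection techniques;
# both the keys and the groupings are illustrative assumptions.
THREAT_TO_TECHNIQUES = {
    "accidental deletion": ["snapshots", "versioned backups"],
    "site outage": ["replication", "offsite/cloud copies"],
    "ransomware": ["immutable or offline copies", "point-in-time backups"],
    "silent corruption": ["checksums", "erasure coding/parity", "RAID"],
}

def techniques_for(threat: str) -> list[str]:
    # Fall back to a plain discrete backup copy for unlisted threats.
    return THREAT_TO_TECHNIQUES.get(threat, ["discrete backup copy"])

print(techniques_for("ransomware"))
```

Even a table this small shows why a single tool (say, nightly backup alone) leaves some threat scenarios uncovered.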
No tools in the toolbox (physical, virtual or cloud)
Part of the rethinking is expanding the focus from what the tools are, who makes what, how they work, and their features and functions, to how to use the tool or technology for different things.
Various tools (hardware, software, services) for different physical, virtual and cloud tasks
This is like going into a store such as Lowe's or Home Depot and talking to the salespeople there (ok, associates or team members) who can tell you everything there is to know about the tool or technology, however they can't tell you how to use it.
Sometimes you get lucky and there will be somebody working at the tool (hardware or software) store who will ask what you are trying to do and, based on their experience, suggest a different approach with another tool or tools, along with some supporting material, parts and supplies.
Does this sound familiar to data infrastructure or IT in general, not to mention server, storage, backup and data protection among other areas of interest?
If all you have, or know how to use, is a hammer, then everything and every situation starts to look like a nail. Expand your toolbox with more tools AND learn how to use or apply them in new and different ways. Align the right tool, technology and technique to the task at hand!
Expand from talking new technology to using new (and old) things in new ways
In addition to focusing on new tools and technology along with their associated terminologies across physical, virtual and cloud environments, it is also time to expand the discussion and awareness to using new (and old) things in new ways. This also means expanding the terminology from backup/restore to more comprehensive data protection as part of modernizing your environment.
For example, some people (and vendors) use the term or phrase "Modernizing Data Protection" to mean swapping out tape for disk, or disk for cloud, or one cloud for another cloud, or upgrading from one software version to another, or simply swapping one vendor's software or tool for another, yet continuing to use it for all practical purposes in the same way. Sure, there is value in moving from hourly or daily copies to tape over to direct-to-disk, and then redeploying tape where it is better suited (streaming large amounts of data, powering off to save energy, e.g. deep cold archive). This also means leveraging disk's fast random access for small files that need to be recovered (usually within the first hours or days of being protected).
Aligning tools, technologies, techniques to various threat risk scenarios
Modernizing data protection (also known as transformation) also means recognizing that not everything is the same in the data center or information factory regardless of size, and that there are also different and evolving data access patterns. Another reason and trend to consider is that there is no such thing as an information recession and that people plus data are living longer as well as getting larger.
Expand your awareness and focus beyond simply knowing what the tools are and who makes them, to how, when, where and why to use them, along with the pros and cons in different situations. This means having multiple tools in your data protection toolbox as well as knowing how to use different tools for various tasks instead of always using a hammer. - GS @StorageIO
The data protection continuum, more than tools and technologies
Call to action, stop talking about it, start walking the talk
If you or somebody else is tired of hearing about backup, then stop complaining about it and take some action. Following are some things to expand your thinking, awareness, discussions and activities around modernizing data protection (and moving beyond traditional backup).
- Take a step back and check the basics or fundamentals of data protection which when enabled, allows your organization to move forward after a small or big incident (or disaster).
- Start thinking beyond backup tools and technologies (hardware, software, services), in particular how it's been done, to why it needs to be done and how it can be done differently.
- Revisit why you are protecting different things; realize that not everything is the same, so does that mean you have to protect everything the same way?
- Learn about how to use different tools and technologies which is different from learning about the tools, features and functions.
- Also keep in mind that a common barrier is people and process (along with organizational politics), which also result in new (and old) technologies being used in old ways.
- Think about using different tools and technologies in different (e.g. hybrid) ways.
- This means starting to use new (and old) tools, technologies and techniques in new ways; apply your return on innovation by using things to address issues, vs. simply using them for the sake of using them.
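Tying the action items above together, "not everything is the same" can be sketched as picking a protection approach per data class instead of one policy for all. The class names, methods and numbers below are assumptions made up for the example, not recommendations:

```python
# Illustrative only: classify data, then select a protection policy per class.
# The data classes, methods and targets are hypothetical example values.
POLICIES = {
    "critical-transactional": {"method": "CDP + replication", "rpo_hours": 0, "retention_days": 90},
    "general-business": {"method": "daily backup + snapshots", "rpo_hours": 24, "retention_days": 30},
    "cold-archive": {"method": "tape or cloud cold archive", "rpo_hours": 168, "retention_days": 3650},
}

def protection_for(data_class: str) -> dict:
    # Default to the general-business policy rather than failing outright.
    return POLICIES.get(data_class, POLICIES["general-business"])

print(protection_for("cold-archive")["method"])
```

Even this toy version forces the useful questions: which data falls into which class, and what does each class actually require?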
In addition to the above items, here are some added links on various topics and themes mentioned here:
Via StorageIOblog - Only You Can Prevent Cloud Data Loss, Cloud conversations: confidence, certainty and confidentiality, Modernizing data protection with certainty, More Data Footprint Reduction (DFR) Material, More modernizing data protection, virtualization and clouds with certainty, EMC Evolves Enterprise Data Protection with Enhancements and Data protection modernization, more than swapping out media.
Via Internet evolution - People, Not Tech, Prevent IT Convergence.
Closing comments (for now)
Now having said all of that, it would be unrealistic to think that we can simply drop the term backup overnight and switch to data protection; after all, we need backwards compatibility. However, until the industry, meaning vendors, their pundits (analysts, bloggers, consultants, evangelists), press/media, VARs, investors and customers, starts thinking and speaking in the broader context of data protection and life beyond backup, guess what, we will still be talking about backup. Start calling it (e.g. backup) data protection and perhaps within a generation (or sooner), the term backup will have been ILM'd, compressed, deduped, tiered, spun down and put into deep cold archive storage to take a long REST on object storage with a NAS interface in a software defined hybrid virtualized cloud ;).
Watch for more data protection conversations about related trends, themes, technologies, techniques perspectives in my ongoing data protection diaries discussions (e.g. www.dataprotectiondiaries.com).
Ok, nuff said
"We are an all-flash array storage provider but our focus has been on VM-aware storage specifically for virtualized applications," stated Dhiraj Sehgal of Tintri in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Dec. 5, 2016 03:30 AM EST Reads: 613
"We build IoT infrastructure products - when you have to integrate different devices, different systems and cloud you have to build an application to do that but we eliminate the need to build an application. Our products can integrate any device, any system, any cloud regardless of protocol," explained Peter Jung, Chief Product Officer at Pulzze Systems, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Dec. 5, 2016 03:30 AM EST Reads: 919
In his general session at 19th Cloud Expo, Manish Dixit, VP of Product and Engineering at Dice, discussed how Dice leverages data insights and tools to help both tech professionals and recruiters better understand how skills relate to each other and which skills are in high demand using interactive visualizations and salary indicator tools to maximize earning potential. Manish Dixit is VP of Product and Engineering at Dice. As the leader of the Product, Engineering and Data Sciences team at D...
Dec. 5, 2016 01:30 AM EST Reads: 728
It's easy to assume that your app will run on a fast and reliable network. The reality for your app's users, though, is often a slow, unreliable network with spotty coverage. What happens when the network doesn't work, or when the device is in airplane mode? You get unhappy, frustrated users. An offline-first app is an app that works, without error, when there is no network connection. In his session at 18th Cloud Expo, Bradley Holt, a Developer Advocate with IBM Cloud Data Services, discussed...
Dec. 5, 2016 01:00 AM EST Reads: 3,325
Between 2005 and 2020, data volumes will grow by a factor of 300 – enough data to stack CDs from the earth to the moon 162 times. This has come to be known as the ‘big data’ phenomenon. Unfortunately, traditional approaches to handling, storing and analyzing data aren’t adequate at this scale: they’re too costly, slow and physically cumbersome to keep up. Fortunately, in response a new breed of technology has emerged that is cheaper, faster and more scalable. Yet, in meeting these new needs they...
Dec. 5, 2016 12:45 AM EST Reads: 1,804
Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value. In his session at 20th Cloud Expo, Ed Featherston, director/senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.
Dec. 5, 2016 12:45 AM EST Reads: 1,566
In addition to all the benefits, IoT is also bringing new kind of customer experience challenges - cars that unlock themselves, thermostats turning houses into saunas and baby video monitors broadcasting over the internet. This list can only increase because while IoT services should be intuitive and simple to use, the delivery ecosystem is a myriad of potential problems as IoT explodes complexity. So finding a performance issue is like finding the proverbial needle in the haystack.
Dec. 5, 2016 12:30 AM EST Reads: 6,079
When it comes to cloud computing, the ability to turn massive amounts of compute cores on and off on demand sounds attractive to IT staff, who need to manage peaks and valleys in user activity. With cloud bursting, the majority of the data can stay on premises while tapping into compute from public cloud providers, reducing risk and minimizing need to move large files. In his session at 18th Cloud Expo, Scott Jeschonek, Director of Product Management at Avere Systems, discussed the IT and busin...
Dec. 5, 2016 12:15 AM EST Reads: 3,814
According to Forrester Research, every business will become either a digital predator or digital prey by 2020. To avoid demise, organizations must rapidly create new sources of value in their end-to-end customer experiences. True digital predators also must break down information and process silos and extend digital transformation initiatives to empower employees with the digital resources needed to win, serve, and retain customers.
Dec. 5, 2016 12:15 AM EST Reads: 1,158
"We are the public cloud providers. We are currently providing 50% of the resources they need for doing e-commerce business in China and we are hosting about 60% of mobile gaming in China," explained Yi Zheng, CPO and VP of Engineering at CDS Global Cloud, in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Dec. 4, 2016 11:45 PM EST Reads: 917
Fact is, enterprises have significant legacy voice infrastructure that’s costly to replace with pure IP solutions. How can we bring this analog infrastructure into our shiny new cloud applications? There are proven methods to bind both legacy voice applications and traditional PSTN audio into cloud-based applications and services at a carrier scale. Some of the most successful implementations leverage WebRTC, WebSockets, SIP and other open source technologies. In his session at @ThingsExpo, Da...
Dec. 4, 2016 10:45 PM EST Reads: 1,670
"Once customers get a year into their IoT deployments, they start to realize that they may have been shortsighted in the ways they built out their deployment and the key thing I see a lot of people looking at is - how can I take equipment data, pull it back in an IoT solution and show it in a dashboard," stated Dave McCarthy, Director of Products at Bsquare Corporation, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Dec. 4, 2016 10:45 PM EST Reads: 1,000
@DevOpsSummit taking place June 6-8, 2017 at Javits Center, New York City, is co-located with the 20th International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. @DevOpsSummit at Cloud Expo New York Call for Papers is now open.
Dec. 4, 2016 08:30 PM EST Reads: 1,805
Predictive analytics tools monitor, report, and troubleshoot in order to make proactive decisions about the health, performance, and utilization of storage. Most enterprises combine cloud and on-premise storage, resulting in blended environments of physical, virtual, cloud, and other platforms, which justifies more sophisticated storage analytics. In his session at 18th Cloud Expo, Peter McCallum, Vice President of Datacenter Solutions at FalconStor, discussed using predictive analytics to mon...
Dec. 4, 2016 07:00 PM EST Reads: 4,918
Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still performance data is a secret language only a couple of people understand. The more business becomes digital the more stakeholders are interested in this data including how it relates to business. Some of these people have never used a monitoring tool before. They have a question on their mind like “How is my application doing” but no id...
Dec. 4, 2016 06:30 PM EST Reads: 2,175