Study Finds Improved Copy Data Management Could Save Federal Government $16.5 Billion Wasted On Redundant Data

MeriTalk, a public-private partnership focused on improving the outcomes of government IT, today announced the results of its new report, “Consolidation Aggravation: Tip of the Data Management Iceberg.” The study, underwritten by Actifio, reveals that by 2024, agencies will spend as much as $16.5 billion storing redundant copies of non-production data – working directly against the Federal Data Center Consolidation Initiative (FDCCI). While Federal agencies have prioritized consolidation and transitioned to more efficient and agile cloud-based systems, 72 percent of Federal IT managers said their agency has maintained or increased its number of data centers since FDCCI launched in 2010. Only 6 percent gave their agency an “A” for consolidation efforts against FDCCI’s 2015 deadline.

According to the report, key barriers to consolidation – including overall resistance, data management challenges, and data growth – are preventing data center optimization and actually driving copy data growth, resulting in increased storage costs.

Federal IT managers noted that managing data growth and consolidating data centers are top priorities for next year, and obvious synergies exist between the two. The study found that agencies don’t necessarily have too many servers or too much space – they have too many systems creating redundant copies of data for multiple purposes. More than one in four agencies use 50 to 88 percent of their data storage to hold copy or non-primary data – and storing these copies is costly. In fact, 27 percent of the average agency’s storage budget went toward non-primary data in 2013, and this year, agencies expect that number to grow to 31 percent. In hard dollars, that translates to a $2.7 billion cost in 2014, a $3.1 billion cost in 2015, and as much as $16.5 billion over the next ten years.
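As a rough illustration of how those figures fit together, the sketch below is a back-of-envelope check using only the numbers quoted above. It assumes each dollar figure corresponds to the percentage of the same overall Federal storage budget quoted alongside it; that budget is our inference, not a figure stated in the report.

```python
# Back-of-envelope check using only figures quoted in the release. Assumption
# (not stated in the report): each dollar figure corresponds to the percentage
# of the same overall Federal storage budget quoted alongside it.

pairs = [
    (2.7e9, 0.27),  # $2.7B non-primary data cost at a 27 percent share of the storage budget
    (3.1e9, 0.31),  # $3.1B non-primary data cost at a 31 percent share of the storage budget
]

for cost, share in pairs:
    implied_budget = cost / share
    print(f"${cost/1e9:.1f}B at {share:.0%} implies a total storage budget of ~${implied_budget/1e9:.1f}B")

# Both pairs imply a total Federal storage budget of roughly $10B per year,
# which is the scale against which the projected $16.5B ten-year cost sits.
```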

A growing number of applications and multiple data owners are propelling growth in the number of data copies – and one in three agencies admit that they do not vary the number of copies based on an original copy’s significance or the likelihood that it will be used again. In fact, 40 percent of Federal data assets exist four or more times.

Ironically, when asked about the top pain points associated with copy management, respondents listed regulatory requirements, cultural challenges, and storage shortfalls – all ahead of data growth.

“We’ve seen the dramatic impact of a more holistic approach to copy data management in the private sector for years now,” said Ash Ashutosh, Founder and CEO of Actifio. “Frankly I’m not surprised by the magnitude of the potential savings at the Federal level, or that this has now come to light as a significant barrier to FDCCI. Copy data virtualization is today where server virtualization was 10 years ago. We’re thrilled it’s now been identified as a strategy that can dramatically accelerate the process of data center consolidation, and get FDCCI back on track.”

The majority of survey respondents said that better management of copy data will help make their agency’s consolidation efforts under FDCCI successful, though just 9 percent of agencies have implemented projects to better manage storage and data growth today. As agencies work toward the FDCCI deadline, and ultimately, a transition to the cloud, they must shift the discussion of FDCCI from server virtualization to enhanced data management and virtualization.

“With the public flogging that is healthcare.gov, agencies’ IT departments have a siege mentality,” said Steve O’Keeffe, founder of MeriTalk. “Leaders like Terry Halvorsen, the new CIO for the Department of Defense, are showing real leadership – going at the root causes for today’s Federal IT malaise. Data and application sprawl are the enemies of government IT efficiency. We need leadership to empower Federal IT innovators to change the failing equation. We need a cultural and acquisition shift to enable new models and the shared services that will unlock new efficiencies and real savings.”

“Consolidation Aggravation” is based on an online survey of 150 Federal IT managers in May 2014. The report has a margin of error of +/- 7.97 percent at a 95 percent confidence level. To download the full study, please visit www.meritalk.com/copydata.

About MeriTalk

The voice of tomorrow’s government today, MeriTalk is a public-private partnership focused on improving the outcomes of government IT. Focusing on government’s hot-button issues, MeriTalk hosts Big Data Exchange, Cloud Computing Exchange, Cyber Security Exchange, and Data Center Exchange – platforms dedicated to supporting public-private dialogue and collaboration. MeriTalk connects with an audience of 85,000 government community contacts. For more information, visit www.meritalk.com or follow us on Twitter, @meritalk. MeriTalk is a 300Brand organization.
