
Veeam DataLabs™ Provides Isolated Instances of Any Production Environment for DevTest, DevOps and DevSecOps to Accelerate Innovation and Reduce Risk

VeeamON -- Veeam® Software, the leader in Intelligent Data Management for the Hyper-Available Enterprise™, today unveiled a key new component of the Veeam Hyper-Availability Platform: Veeam DataLabs™. A form of copy data management, Veeam DataLabs allows organizations to easily create new instances of their production environments on demand. This capability enables use cases beyond typical data protection scenarios, such as DevTest, DevOps and DevSecOps, security and forensics testing, and on-demand sandboxes for IT operations, delivering value to the business by accelerating innovation, improving operational efficiency, reducing risk and optimizing resources.

Veeam DataLabs takes the functionality of Veeam’s Virtual Labs—enabling production-like instances of virtual environments on-demand—and expands on it with additional use cases and business value. These isolated “sandboxes” leverage existing data to accelerate innovation and to reduce risk. Veeam DataLabs takes the core components of backup and replication from being a reactive insurance policy to a proactive value-added service for the enterprise. Efficiencies are gained by capturing data once by way of the data protection process, and then repurposing that data on demand for new use cases, potentially by new users, to unlock the potential value of that data.

“Managing data is the Achilles heel of IT when it comes to delivering what end customers demand,” said Danny Allan, Vice President of Product Strategy at Veeam. “However, in delivering on these demands, enterprises are under pressure to rapidly accelerate innovation while ensuring zero impact on the user experience. These two goals rarely go hand in hand. With Veeam DataLabs, you can mitigate the risks associated with application deployment and configuration changes by testing them in a secure, trusted, production-like environment. Customers can accelerate innovation and time to market by leveraging an isolated, virtual sandbox for rapid application development and testing.”

Veeam DataLabs enables a self-service method for developers to dynamically spin up instances of the production environment as they design new features. This accelerates the pace of digital service delivery and ensures that teams develop and test against the most recent copies of data. Veeam DataLabs also provides sandbox environments for IT Operations to test new patches and updates before they are rolled out across the company. The Security and Forensics team can use copies of the data to test for security vulnerabilities without disrupting production systems, or to perform forensics on an event flagged by their security information and event management (SIEM) platform. Finally, Compliance and Analysis groups can examine and classify data for capacity planning purposes and to help comply with regulations such as GDPR. Veeam DataLabs enables all of these scenarios without disrupting production systems or requiring additional infrastructure.
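To make the self-service pattern above concrete, the following is a minimal, purely illustrative sketch of how a DevTest pipeline might request an isolated sandbox built from the most recent backup, wait for it to come online, and tear the copy down after testing. The REST routes (/api/restorePoints, /api/sandbox), field names, host and VM name are hypothetical placeholders for whatever provisioning interface an organization exposes; they are not documented Veeam DataLabs APIs.

# Illustrative only: self-service sandbox provisioning pattern.
# All endpoints and field names below are hypothetical placeholders,
# not documented Veeam DataLabs API routes.
import time
import requests

BASE_URL = "https://backup-server.example.com:9398"        # hypothetical host
SESSION = requests.Session()
SESSION.headers.update({"Authorization": "Bearer <token>"})  # placeholder token


def latest_restore_point(vm_name: str) -> str:
    """Find the most recent restore point for a protected VM (hypothetical route)."""
    resp = SESSION.get(f"{BASE_URL}/api/restorePoints", params={"vm": vm_name})
    resp.raise_for_status()
    points = sorted(resp.json()["points"], key=lambda p: p["createdAt"])
    return points[-1]["id"]


def provision_sandbox(vm_name: str) -> str:
    """Request an isolated, production-like instance built from the latest backup."""
    body = {"restorePointId": latest_restore_point(vm_name), "isolated": True}
    resp = SESSION.post(f"{BASE_URL}/api/sandbox", json=body)
    resp.raise_for_status()
    sandbox_id = resp.json()["id"]
    # Poll until the sandbox reports ready, then hand it to the test stage.
    while SESSION.get(f"{BASE_URL}/api/sandbox/{sandbox_id}").json()["state"] != "Ready":
        time.sleep(15)
    return sandbox_id


def teardown_sandbox(sandbox_id: str) -> None:
    """Discard the sandbox once tests finish; production data is never touched."""
    SESSION.delete(f"{BASE_URL}/api/sandbox/{sandbox_id}").raise_for_status()


if __name__ == "__main__":
    sb = provision_sandbox("erp-app-01")   # hypothetical VM name
    print(f"Sandbox {sb} ready for integration tests")
    teardown_sandbox(sb)

The point of the pattern is that the test environment is always derived from already protected data rather than from live systems, so sandboxes can be created and discarded repeatedly without ever touching production.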

“Veeam DataLabs with On-Demand Sandbox capabilities allows us to rapidly re-create our production environment in a totally isolated instance so that we can perform non-disruptive DR testing and prove that we can meet application availability SLAs,” said Jeff Martinson, Director of Information Technology at Ameritas. “With these features, we are much more confident in our ability to exceed audit requirements through systemic demonstration of success.”

In addition, Veeam DataLabs integrates with Veeam’s storage partners through the Universal Storage APIs. The same storage snapshot integrations that accelerate data protection can also be used with partners such as Cisco, HPE, IBM, NetApp, Pure Storage and others to deliver this added value even more efficiently.

“Veeam Hyper-Availability Platform is the only way to enable faster reaction times to any business need, reach multifold improvements in efficiencies and deliver the agility to meet evolving customer demands; Veeam DataLabs is a perfect example of how we are expanding our data management capabilities, going beyond Availability and innovating to meet the needs of the Hyper-Available Enterprise,” added Allan.

For more information on Veeam’s vision for the Hyper-Available Enterprise, please visit www.veeam.com.

About Veeam Software

Veeam is the global leader in Intelligent Data Management for the Hyper-Available Enterprise. Veeam Hyper-Availability Platform is the most complete solution to help customers on the journey to automating data management and ensuring the Hyper-Availability of data. We have more than 300,000 customers worldwide, including 75 percent of the Fortune 500 and 58 percent of the Global 2000. Our customer satisfaction scores, at 3.5X the industry average, are the highest in the industry. Our global ecosystem includes 55,000 channel partners; Cisco, HPE, and NetApp as exclusive resellers; and nearly 19,000 cloud and service providers. Headquartered in Baar, Switzerland, Veeam has offices in more than 30 countries. To learn more, visit https://www.veeam.com or follow Veeam on Twitter @veeam.
