Control Costs: Measure Test Environment Effort

How many people does it take to set up and tear down a test environment? If you ask this question and measure the effort required, you might be surprised by the result. While organizations often focus on the hardware budget for test environments, the largest cost in test environment management is almost always the labor required to configure and reconfigure systems. Configuring an environment that hasn't been automated often takes a team of two or three people a full day, and test environments can keep valuable specialists such as DBAs busy with repetitive tasks like executing schema changes and loading test data.
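
Those repetitive DBA tasks are exactly the kind of work a script can absorb. Here is a minimal sketch of a repeatable setup and teardown step, using SQLite and an invented `customers` table purely for illustration; a real pipeline would run your actual migrations and fixtures against your actual database.

```python
import sqlite3

# Hypothetical schema change and test data, standing in for the real
# migrations and fixtures a DBA would otherwise apply by hand.
SCHEMA = """
CREATE TABLE IF NOT EXISTS customers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
"""

TEST_DATA = [(1, "Test Customer A"), (2, "Test Customer B")]

def set_up_environment(db_path=":memory:"):
    """Apply the schema and load test data in one repeatable step."""
    conn = sqlite3.connect(db_path)
    conn.executescript(SCHEMA)
    conn.executemany("INSERT INTO customers (id, name) VALUES (?, ?)", TEST_DATA)
    conn.commit()
    return conn

def tear_down_environment(conn):
    """Drop everything so the environment can be handed to the next project."""
    conn.execute("DROP TABLE IF EXISTS customers")
    conn.commit()
    conn.close()

conn = set_up_environment()
print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])
tear_down_environment(conn)
```

Once the steps live in a script rather than a runbook, setup takes seconds instead of a day, and it produces the same environment every time.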

If you are sharing test environments to save money, you should measure the effort required to set up and tear down environments between projects. The cost savings achieved by sharing environments are likely small compared to the cost of the labor required to reconfigure environments between projects. People are more expensive than virtual machines.
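
The arithmetic is worth running for your own numbers. This back-of-the-envelope sketch uses purely illustrative rates (the hourly figures are assumptions, not benchmarks) to compare one manual reconfiguration against simply leaving a dedicated VM running:

```python
# Illustrative, assumed rates -- substitute your organization's real figures.
ENGINEER_HOURLY_RATE = 75.0   # assumed loaded labor cost per engineer-hour
VM_HOURLY_RATE = 0.20         # assumed cost of a modest cloud VM per hour

def reconfiguration_cost(people, hours):
    """Labor cost of one manual environment reconfiguration."""
    return people * hours * ENGINEER_HOURLY_RATE

def dedicated_vm_cost(hours):
    """Cost of keeping a dedicated VM running instead of sharing."""
    return hours * VM_HOURLY_RATE

# Two people spending a day (8 hours) reconfiguring a shared environment...
labor = reconfiguration_cost(people=2, hours=8)   # 1200.0
# ...versus keeping a dedicated VM running for an entire month.
vm_month = dedicated_vm_cost(hours=24 * 30)       # 144.0
print(labor, vm_month)
```

Under these assumed rates, a single manual reconfiguration costs several times more than a month of dedicated VM time, which is the whole point: people are more expensive than virtual machines.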

Manual Test Environment Effort is Time Wasted

The biggest cost associated with manual test environment management is the opportunity cost of having critical resources focused on manual setup and teardown activities. Measuring the effort required for test environment management is the first step toward justifying initiatives to automate environment setup and to invest in cloud-based test environment infrastructure. Use Plutora to measure the effort required to maintain, set up, and tear down test environments, and identify opportunities for automation.
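
The measurement itself can start very simply: log every environment activity with the hours it consumed, then total effort by activity to see where automation would pay off first. The sketch below invents a tiny CSV layout for illustration; a tool like Plutora captures and reports this data for you.

```python
import csv
import io
from collections import defaultdict

# Hypothetical effort log -- in practice this comes from your TEM tooling.
LOG = """environment,activity,hours
QA-1,setup,6
QA-1,teardown,3
QA-2,setup,9
QA-2,data load,4
"""

def effort_by_activity(log_text):
    """Total logged hours per activity, to spot automation candidates."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(log_text)):
        totals[row["activity"]] += float(row["hours"])
    return dict(totals)

print(effort_by_activity(LOG))
```

Even this crude roll-up makes the conversation concrete: if setup dominates the totals, setup is what you automate first.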

If you capture the true cost of manual test environment management, it becomes much easier to pitch an automation initiative to management. Even if it requires a short-term increase in your TEM budget, the long-term benefits of automating test environments are overwhelming. Freeing test environment engineers and DBAs to focus on higher-value problems yields immediate benefits, and with automated test environments your team will breeze through QA instead of waiting for manual environment work.
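
A useful framing for that pitch is the payback period: how many months until the automation investment recovers its cost from the labor it frees up? Both figures below are assumptions chosen only to show the shape of the calculation:

```python
# Assumed figures for illustration -- use your own measured costs.
AUTOMATION_COST = 40_000.0             # assumed one-off tooling/engineering spend
MONTHLY_MANUAL_EFFORT_COST = 10_000.0  # assumed labor freed up each month

def payback_months(investment, monthly_savings):
    """Months until cumulative savings cover the up-front investment."""
    months = 0
    recovered = 0.0
    while recovered < investment:
        recovered += monthly_savings
        months += 1
    return months

print(payback_months(AUTOMATION_COST, MONTHLY_MANUAL_EFFORT_COST))  # 4
```

If your measured numbers look anything like these, the short-term budget increase pays for itself within a few months, and everything after that is pure savings.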

“How long does it take to get a test environment?”

If you’ve ever had to tell a project manager or an engineering manager to wait a few weeks for a test environment, you’ll understand how difficult it can be to convey that message. In an era of instant automation and continuous deployment pipelines, your internal customers have a completely different set of expectations. “Why does it take so long to get a QA environment?” is one of the most common questions in IT.

And you are competing with public cloud providers. Developers are familiar with public cloud APIs, and many have had experience working in companies that were able to spin up and tear down environments in minutes. These developers might not fully appreciate the constraints present in a larger organization that hasn’t fully embraced public cloud infrastructure. When you talk of security constraints and capital expenditure requests for hardware, these internal customers are just wondering, “Why can’t I have my QA environment quickly?”

Stop Wondering, Start Measuring

All too often companies just ignore the work effort required to bring a test environment online. Unless your organization has completely embraced cloud-based infrastructure and infrastructure automation, it can take days or sometimes weeks to get everything aligned for a new QA environment. Development teams are often driving environment managers to create more and more environments to support parallel development, and it can be a constant challenge to communicate the amount of effort required.

Developers and development managers often discount the work of test environment management and wonder why it takes so long to stand up new test environment infrastructure. When this happens, it is often because test environment managers haven't captured the true costs associated with standing up a new environment. Plutora gives you the opportunity to start measuring the effort required, and it provides a tool that makes this cost visible not only to your internal customers but to management as well.

To fix the perception issue created by a slow-moving TEM effort, and to encourage a more rational approach to staffing test environment management, test environment managers should use tools that capture the effort required to maintain and set up test environments. Plutora is just such a tool: we have built features into Plutora that let test environment managers track the effort required and run reports that can inform planning around staffing and capacity allocation.

Just measuring your costs can go a long way towards controlling them, and Plutora can help you do just that.

The post Control Costs: Measure Test Environment Effort appeared first on Plutora.

