Control Costs: Keep Track of Test Environments

Test environments cost money; this is the inescapable fact of test environment management. It can be difficult to justify the expense in your conversations with finance and management, especially if they lack a software development background. Businesses are often fine with dropping millions on production infrastructure because there’s a direct line between keeping production running and revenue. On the other hand, QA environments are often viewed by management as unnecessary overhead. Why do we need all these QA environments?

I’ve even spoken with an executive who viewed increased spending on test environments as a negative. His mistaken assumption was that his company was forced to spend money on test environments because his developers wrote buggy code. It was the first time I had heard this assumption stated so plainly, but I wasn’t surprised. “IT” is still a mystery to most senior executives, and the idea that testing is only necessary because programmers write bad code merely reflects this company’s inexperience with software development.

Your best arguments for increased test environment spend are quality and agility: the number and fidelity of your test environments directly affect the quality of the customer experience in production. That said, it is still an uphill battle. Test environments are overhead, and CFOs can be skeptical of the return on investment. In the next few posts, I’m going to focus on tools you can use to reduce test environment cost. Here’s the first suggestion.

Keep Close Track of Test Environment Allocation

A poorly executed test environment management effort is a “fire-and-forget” operation. Managers allocate environments to projects, projects are left on their own to manage infrastructure, and environments are rarely reclaimed. In these organizations, the test environment management team doesn’t manage environments; it simply hands out hardware to development teams without measuring utilization. A group that just hands out hardware in an IT department is akin to a free candy store: everyone will show up asking for an unlimited number of “test environments.”

This model leads to inefficiency. Organizations that don’t keep a close eye on test environment allocation fail to understand which environments are actually being used. When a test environment management (TEM) team just hands out capacity and infrastructure to development groups, there is no economic or market pressure to use environments efficiently. Teams ask for environments and sit on them, even when they go unused. “Have you used this test environment in the last year?” “I’m not sure.”
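To make “I’m not sure” impossible, track when each environment was allocated and when it was last used. Here is a minimal sketch of that idea; the environment names, fields, and 90-day idle threshold are all illustrative, not drawn from any particular tool:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical allocation record; field names are illustrative only.
@dataclass
class TestEnv:
    name: str
    team: str
    allocated_on: date
    last_used_on: date

def idle_environments(envs, today, max_idle_days=90):
    """Return environments with no recorded use inside the idle window."""
    cutoff = today - timedelta(days=max_idle_days)
    return [e for e in envs if e.last_used_on < cutoff]

envs = [
    TestEnv("qa-payments-01", "payments", date(2017, 1, 5), date(2017, 4, 20)),
    TestEnv("qa-payments-07", "payments", date(2016, 3, 1), date(2016, 6, 2)),
]
stale = idle_environments(envs, today=date(2017, 5, 1))
for e in stale:
    print(f"{e.name} ({e.team}): last used {e.last_used_on}")
```

Even a crude last-used timestamp like this gives the TEM team something to bring to the conversation when a team asks for its thirteenth environment.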

That’s Not a Test Environment, That’s a Secret Project

High-profile projects might end up with twelve QA environments, all requested under the pretense that the project needed to support twelve simultaneous development streams. This is usually not the case. Even the largest, most complex projects in the industry can only successfully build on three parallel development tracks for version N, N+1, and (maybe) N+2, and even that level of parallel development is full of risks and coordination challenges. A team with twelve QA environments is likely using some of them to support personal development branches and other systems unrelated to QA testing. For example, one of our clients handed out thirteen QA environments to a single project, only to realize that the project had repurposed some of this infrastructure for another internal project being developed “off the books.” What was intended as QA infrastructure had become CI/CD infrastructure for a secret project the team planned to propose for next year’s budget. In this way, the test environment management budget was being used to support another department’s secret infrastructure.

If your organization is large enough, and you start using Plutora to keep track of test environments, there’s a good chance you’ll find entire racks full of hardware being used to support “testing.” But in reality, they are being used to support non-standard build pipelines, custom developer environments, and other uses that aren’t properly tracked. (And, some of these systems may be necessary, but they shouldn’t be tracked against your test environment budget.)

Fail to Keep Track of Test Environments: Pay a Penalty

Organizations that don’t use Plutora to keep track of test environment use are often paying many times what they should. They suffer from a lack of transparency that encourages teams to hoard test infrastructure, and they create the possibility of a “dark budget” through which departments can drive entire projects without accountability.

If you are attempting to reduce your overall test environment management costs, one of the first steps should be auditing your test environments and finding out which ones aren’t used. You may be surprised to find that your organization is holding onto test environments used to qualify long-retired software systems. Or, you may be surprised to find testing environments that have been repurposed by teams for uses far outside the scope of software testing.
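An audit pass along these lines can be sketched as follows; the inventory, field names, and statuses below are hypothetical, purely for illustration. The idea is to flag any environment that supports a retired system or whose recorded purpose has drifted outside QA testing:

```python
# Illustrative audit data: environment -> (supported system, system status,
# recorded purpose). None of these names come from a real inventory.
inventory = {
    "qa-billing-03": ("billing-v2",   "active",  "qa-testing"),
    "qa-legacy-01":  ("mainframe-gw", "retired", "qa-testing"),
    "qa-mobile-09":  ("mobile-app",   "active",  "ci-cd-pipeline"),
}

def reclamation_candidates(inv):
    """Flag environments whose system is retired or whose recorded
    purpose falls outside QA testing, with the reasons why."""
    flagged = {}
    for env, (system, status, purpose) in inv.items():
        reasons = []
        if status == "retired":
            reasons.append(f"supports retired system {system}")
        if purpose != "qa-testing":
            reasons.append(f"repurposed for {purpose}")
        if reasons:
            flagged[env] = reasons
    return flagged

for env, reasons in reclamation_candidates(inventory).items():
    print(env, "->", "; ".join(reasons))
```

The output of an audit like this is a concrete reclamation list, which is far easier to defend in a budget conversation than a general claim that “some environments are probably wasted.”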

Put simply, one of the most effective ways to keep test environment costs down is to identify wasteful environments and reclaim them using Plutora.

The post Control Costs: Keep Track of Test Environments appeared first on Plutora.


