By Plutora Blog
December 6, 2016 08:33 PM EST
Test environment managers are often asked to audit everything. If there is a question about budgets, they need to quickly identify opportunities to reallocate infrastructure and environments. Hardware is expensive and resources are limited, so when a business looks at the numbers and asks, “Why do we spend so much on hardware?” CTOs and CIOs are asked to scrutinize spend quickly.
Before tools like Plutora were available, a test environment manager would fire up Microsoft Excel and create a spreadsheet, then ask environment engineers and development teams to fill it out with the resources they were using for QA and testing. As technology has improved, some organizations have adopted tools and systems that automate the creation of this environment inventory, but in many companies this is still a manual process.
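To make the idea of an environment inventory concrete, here is a minimal sketch of the kind of aggregation such a spreadsheet (or an automated tool) performs. The project names, field names, and costs are purely illustrative assumptions, not real data:

```python
from collections import Counter

# Hypothetical inventory records a test environment manager might gather;
# project names, fields, and costs are illustrative only.
environments = [
    {"project": "payments", "env": "QA-1", "monthly_cost": 1200},
    {"project": "payments", "env": "QA-2", "monthly_cost": 1200},
    {"project": "reporting", "env": "QA-1", "monthly_cost": 800},
]

# Count environments and total spend per project.
counts = Counter(e["project"] for e in environments)
spend = {}
for e in environments:
    spend[e["project"]] = spend.get(e["project"], 0) + e["monthly_cost"]

for project in counts:
    print(f"{project}: {counts[project]} environments, ${spend[project]}/month")
```

Even this toy version shows why the manual approach breaks down: every row has to be collected from a team by email, and the data is stale the moment it arrives.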
During this process, there are often discussions about how many environments a particular project really needs. One project may only have one QA environment while another may have four or five. Environment managers are frequently put in a position of having to ask teams to justify why they need so many environments.
There are various reasons why a team might need multiple QA and staging environments; here are just a few:
Teams are working on parallel development efforts. If a project has regular releases, there’s a good chance that when a development team finishes a feature, a QA team takes over to validate it. During that QA process, the development team often wants to move on to the next feature. In these scenarios, having two QA environments makes sense: if a single QA environment is “tied up,” features are delayed and releases have to be serialized.
Long-term feature releases need to be developed while short-term bug fixes are qualified and staged. If you have a team working on a series of larger, multi-month development stories to launch a new product, these efforts almost always require a dedicated environment. These QA efforts take months and require database customizations that cannot ship to production. Without an isolated system for these longer-term initiatives, you will be unable to fix bugs as they are identified in the production system.
Systems that rely on services. If you develop code that depends on back-end services that are also being modified by independent development teams, your teams may require multiple environments, each configured to connect to the appropriate version of a testing service. Service-oriented architectures and microservices cause a combinatorial increase in the number of environments required to perform end-to-end testing.
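The “combinatorial increase” is easy to quantify. As a rough sketch (the service names and version counts below are hypothetical), if each back-end service has several concurrently supported versions, then in the worst case every combination of versions is a distinct end-to-end configuration a test environment could be asked to mirror:

```python
from math import prod

# Hypothetical services, each with the number of versions under active
# development or support that front-end QA may need to test against.
service_versions = {"auth": 2, "billing": 3, "catalog": 2}

# Worst case: every combination of service versions is a distinct
# end-to-end configuration, so the count is the product of the versions.
combinations = prod(service_versions.values())
print(combinations)  # 2 * 3 * 2 = 12
```

In practice, teams only stand up environments for the combinations they actually test, but the multiplicative pressure is why service-heavy projects request so many of them.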
Considering all of these factors is important when a test environment manager examines the current allocation of test environments. There’s no one rule to cover the number of test environments required, but there are a few things to consider when assessing a project’s environment needs:
Projects sitting in front of services often need more environments – If your application fronts a number of services under active development, its team will request multiple QA systems to connect to different releases and versions of those services. If you have a multi-leveled architecture with services depending on other services, you’ll see an even greater number of environments required.
Projects supporting critical, customer-facing applications need to move quickly (with more environments) – Projects supporting back-office operations that can wait a few days for bug fixes often don’t need multiple staging or production environments. Projects that need to respond to customer-facing bugs in hours or minutes need maximum agility, and these are the projects that may require additional environments.
It’s a trade-off – Some projects will demand tens of environments to support multi-stream development and QA efforts for continuous deployment. There’s a trade-off between cost and agility, and it’s one that test environment managers have to be able to calculate and communicate.
It’s easier to do that when you have a tool that keeps track of environments all the time. That tool is Plutora: with our Test Environment Management tool, you’ll be able to see strategic allocation challenges in a single, consolidated place. You’ll never have to fire up Excel and send emails to gather this data again.