How to Choose Where Your Load Should Come From

The goal of a load test is to replicate the traffic & conditions your app experiences in production as realistically as possible

As a tester, you understand how important it is to create the most realistic load test possible to provide confidence that your web application won't fail in the field. But how do you know where your load should come from to produce realistic results?

In this article, we outline the three options most organizations choose from when deciding where their load should come from:

  1. Generate load on dedicated infrastructure in the datacenter
  2. Generate load from a private cloud
  3. Generate load from distributed public cloud locations

There is a time and a place for all three of these options. In this post we will go over when you should do each test and why.

1. Generate load on dedicated infrastructure in the datacenter

When: Perform internal load generation for early stage performance tests and tests on internal IT apps that will be accessed by users behind your firewall.

Why: Load testing from dedicated infrastructure inside your own datacenter is the most common and typically the most accessible way to wring out performance issues in your applications. This type of testing should be performed as part of your regular testing process.

In practice, you define specific load and performance tests that push on individual modules and user paths within the system, confirming that no unintended bottlenecks were introduced during development. The kinds of issues you'll work out with this localized, constrained testing include:

  • Inadequate transaction design
  • Broken links
  • Poor configuration
  • Memory leaks
  • Unoptimized code
  • Sub-optimal session model
  • Ineffective caching
  • Database indexing, connection, and concurrency issues

While performing this type of load test on dedicated equipment is easy, there are a few issues with using this approach as your sole means of load testing - the first of which is that many companies simply do not have adequate dedicated infrastructure resources to realistically test application performance at scale. The capital expense involved with acquiring and managing network equipment, servers, and applications for the sole purpose of performance testing can be hard to justify - especially when you consider that this infrastructure may sit idle for a significant portion of its life.

Second, load testing from anywhere inside your own datacenter environment will test only application performance. It will not test the full capabilities of the datacenter (including network configurations, load balancing, or security systems) and therefore will not produce a realistic picture of the user experience.

Running load tests from your internal environment is an important part of improving application performance, but it is not sufficient. You should integrate tests like those outlined above as a frequent part of your build & integration process. As you near the end of a release cycle, you'll have confidence that the application's functionality performs well under periods of high traffic.

2. Generate load from within a private cloud

When: Perform tests from a private cloud when you need to ramp up your load generation infrastructure quickly for larger tests.

Why: To address some of the issues above, many organizations are running load tests from within their private clouds. Operating on this type of infrastructure has several benefits compared with the first approach.

Functionally, this method of testing is best suited to the same types of performance tests that you would execute using dedicated hardware: database optimization, transaction design, and the rest. However, private clouds address some of the problems with running on dedicated hardware.

First, a private cloud can be significantly less expensive to operate, specifically because the cloud infrastructure can be leveraged across many different applications. When you are done with your load test, virtual machines can be reassigned to complete other tasks such as analytics, development, or other cloud-friendly business processes. This allows an IT organization to distribute infrastructure costs, thus lowering the overall expense.

Second, this approach lends itself very easily to more realistic scaling. You can grab more cycles and more disk space as needed, ultimately driving a more aggressive stress test of the application.
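Because cloud capacity can be claimed on demand, you can ramp virtual users up in stages instead of jumping straight to peak load. The helper below is a hypothetical sketch of such a stepped ramp plan; the function name and parameters are illustrative, not from any particular tool.

```python
def ramp_schedule(start_users, peak_users, steps):
    """Yield (step_index, concurrent_users) pairs for a stepped ramp-up.

    Illustrative only: a real test plan would also attach a hold
    duration to each step so metrics can stabilize before scaling up.
    """
    if steps < 2:
        yield 0, peak_users
        return
    increment = (peak_users - start_users) / (steps - 1)
    for step in range(steps):
        yield step, round(start_users + increment * step)

# Example: ramp from 50 to 500 virtual users over 4 steps
for step, users in ramp_schedule(50, 500, 4):
    print(f"step {step}: {users} users")  # 50, 200, 350, 500
```

Stepping the load this way makes it easier to spot the exact concurrency level at which response times degrade, rather than only learning that the peak was too much.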

Chances are, however, since you are testing from within your own network and environment, you are still not stressing the complete datacenter. As a result, this type of load generation approach won't necessarily give you an accurate picture of what your users will experience in periods of high traffic.

3. Generate load from distributed public cloud locations

When: Perform this test when your app will be accessed primarily by users outside your firewall, from distributed locations.

Why: By running your load tests from outside the datacenter in a public cloud or even a remote private cloud environment, you will get a good picture of how your datacenter infrastructure handles high traffic. This third type of load will test a part of the application's delivery that no amount of testing from your own datacenter can duplicate. This includes:

  • Load balancers
  • Networking equipment configuration
  • Data center connectivity and bandwidth
  • Web firewalls and security systems
  • In-network caching and content distribution
  • Network architecture
  • DNS, web latency, network congestion
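External testing exposes these components because every request crosses the full delivery path: DNS resolution, the network, your perimeter equipment, and only then the application. The sketch below is a hypothetical raw-socket timer (the function name and output fields are made up for illustration) that splits one request into the phases an in-datacenter test never exercises.

```python
import socket
import time

def measure_breakdown(host, port=80, path="/"):
    """Time the DNS, TCP connect, and time-to-first-byte phases of one GET.

    Illustrative sketch: a real external test would run this from many
    distributed agents and over TLS as well.
    """
    # Phase 1: DNS resolution
    t0 = time.perf_counter()
    addr = socket.getaddrinfo(host, port, socket.AF_INET,
                              socket.SOCK_STREAM)[0][4]
    t1 = time.perf_counter()
    # Phase 2: TCP connection (network latency, firewalls, load balancers)
    sock = socket.create_connection(addr, timeout=10)
    t2 = time.perf_counter()
    try:
        # Phase 3: request out, first byte back (server + delivery path)
        request = (f"GET {path} HTTP/1.1\r\n"
                   f"Host: {host}\r\nConnection: close\r\n\r\n")
        sock.sendall(request.encode())
        sock.recv(1)  # block until the first response byte arrives
        t3 = time.perf_counter()
    finally:
        sock.close()
    return {
        "dns_ms": (t1 - t0) * 1000,
        "connect_ms": (t2 - t1) * 1000,
        "ttfb_ms": (t3 - t2) * 1000,
    }
```

Run from inside the datacenter, the DNS and connect phases are near zero; run from a public cloud region, they reflect what your real users actually wait through.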

It's important to focus your testing so it puts the load in the right place - for example, on your firewall or load-balancing infrastructure. You may even choose to load test multiple applications hosted in the same data center at the same time, to see whether shared infrastructure chokes up when the combined load is applied.

Another major point that many testers overlook is that users access your app from many different regions, across different service providers. To get a realistic picture of the actual user experience, it is important to see what that experience looks like from access points around the world. This will give you a better understanding of what happens to your web application when users from different parts of the world flood your site with traffic.

Prevent a Disaster, Don't Cause One
The ultimate goal of a load test is to replicate the traffic and conditions your app experiences in production as realistically as possible, giving you confidence that your site will not fail at peak times with actual live users. Weighing the three major options for where your load should come from will give you a better idea of how your app will perform under stress.

No matter what, each of these tests will catch something different in your application. And as we should all know by now, catching a bug early in the process means you won't have to pull out the big bucks to fix a major disaster down the road. Prevent a disaster, don't cause one.

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
