Challenges with APM 1.0 products

In the past few weeks I have met several customers who are evaluating Application Performance Management (APM) solutions. They are facing a lot of challenges with their existing investments in the old generation of APM tools. In this blog, I will outline some of the shortcomings of APM 1.0 tools that make them unfit for today's applications.

What is APM 1.0?

Customers have been managing application performance since the early days of the mainframe. However, Application Performance Management as a discipline has gained popularity only in the past decade.

 

Let me first introduce what I mean by APM 1.0. Enterprise applications and technologies such as Java have evolved over the past two decades. APM 1.0 tools were invented more than a decade ago, and they provided great benefits in resolving the application issues that were prevalent in early versions of the Java and .NET platforms. However, Java and .NET application servers have since matured and no longer suffer from those problems. Enterprise application architectures and technologies have also changed drastically, and the APM 1.0 tools have not kept up. The following figure shows the evolution of enterprise Java over the past 15 years and when APM 1.0 and APM 2.0 tools started emerging.

 

[Figure: Evolution of enterprise Java over the past 15 years, and the emergence of APM 1.0 and APM 2.0 tools]

Following are a few challenges with APM 1.0 tools that you will run into when trying to manage your enterprise applications.

 

Challenge 1: Not enough end-user focus or visibility into business-critical transactions

 

The application owner and the application support team primarily care about the user experience and the service levels delivered by their applications. APM 1.0 tools, however, were built primarily to monitor applications from an application-infrastructure perspective.

 

These tools lack the capabilities to monitor applications from the real user's perspective and to help you isolate whether an application issue is caused by the network, load balancers, ADNs such as Akamai, the application itself, the database, and so on. Some of these solutions were quick to bolt on basic end-user monitoring capabilities such as synthetic monitoring. However, application support personnel have to switch between multiple consoles and rely on manual correlation between the end-user monitoring and deep-dive application monitoring tools.

These tools do not allow you to track a real user request down to the line of code. That means you are blind-sided when users are impacted, and you struggle to find what is causing the application failure.
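To make "tracking a user request to the line of code" concrete, here is a minimal sketch of trace-ID correlation, the technique modern APM tools use for this. It is not any vendor's actual agent; the `X-Trace-Id` header name and the `handle_request` helper are hypothetical, chosen for illustration:

```python
import traceback
import uuid

def handle_request(headers, work):
    """Attach a trace ID to each incoming request so a user-visible
    failure can be tied back to the exact server-side stack trace."""
    # Reuse the ID injected on the user's side, or mint a new one.
    trace_id = headers.get("X-Trace-Id") or uuid.uuid4().hex
    try:
        return {"trace_id": trace_id, "status": 200, "body": work()}
    except Exception:
        # The same ID the user saw is stored next to the line-level
        # stack trace, so support can jump straight from "user X got
        # an error" to the failing code.
        return {"trace_id": trace_id, "status": 500,
                "error": traceback.format_exc()}

# Hypothetical usage: a request whose handler raises an exception.
resp = handle_request({"X-Trace-Id": "abc123"}, lambda: 1 / 0)
```

Because the same ID travels from the browser to the stack trace, no manual correlation between consoles is needed.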

 

Challenge 2: Built for Development and not suitable for production monitoring

APM 1.0 deep-dive monitoring tools were primarily built to diagnose issues during the application development lifecycle. They morphed into production deep-dive monitoring tools when the need arose for APM in production environments. Because these tools were never optimized for production monitoring, they require a lot of effort to tune for production use.

 

First, the complexity of agent installation and configuration hinders deployment in production environments. Second, these tools usually require configuration changes every time new application code is rolled out.

 

Most damagingly, they impose high overhead on application performance and do not scale beyond 100-150 application servers. As a result, most customers use them in a test environment, or enable deep-dive monitoring retroactively after an application failure, hoping the problem will recur.
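The overhead problem is typically addressed by sampling: record detailed timings for only a fraction of calls so the common path stays cheap. A rough illustration, assuming a made-up `sampled_timer` decorator rather than any real product API:

```python
import random
import time

RECORDED = []  # stand-in for the agent's metric store

def sampled_timer(rate=0.1):
    """Record timing for only a sampled fraction of calls, keeping
    per-call overhead near zero on the unsampled fast path."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if random.random() >= rate:        # fast path: no bookkeeping
                return fn(*args, **kwargs)
            start = time.perf_counter()        # sampled path: measure
            try:
                return fn(*args, **kwargs)
            finally:
                RECORDED.append((fn.__name__, time.perf_counter() - start))
        return inner
    return wrap

@sampled_timer(rate=1.0)  # sample everything so this demo is deterministic
def checkout():
    return "ok"

result = checkout()
```

At a production-style rate of 1-10%, most calls pay only the cost of one random-number check, which is how always-on monitoring can stay affordable at scale.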

 

Finally, these tools do not provide operations-friendly UIs, because they were originally built for developers.

 

Challenge 3: High Cost of Ownership

 

As I alluded to earlier, old-generation APM tools are very complex to configure because they require application knowledge, manual instrumentation, and complex agent deployment. Expensive consultants are therefore required to deploy, configure, and maintain them. These tools also have multiple consoles, adding to the total cost of ownership. Some customers told me that they spend a lot of time managing their APM tools instead of managing their applications.

 

Conclusion: A Poor fit for today’s applications

These tools were built more than a decade ago and have not evolved much, even though application architectures, technologies, and methodologies have gone through drastic changes.

 

Many of the customers I met were of the opinion that they spend more time managing the APM solution than managing their applications. If you use any of the APM 1.0 tools to manage a modern application, you are likely in the same boat. Here are some customer expectations for a modern APM solution:

  • It reduces your MTTR by quickly pinpointing business-critical issues with always-on, user-centric, deep application visibility
  • It is non-invasive: it requires no changes to application code, needs no manual instrumentation, and auto-discovers your transactions, frameworks, etc.
  • It provides quick time to value and ease of use with a single, integrated APM console
  • It is purpose-built for cloud applications
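As an illustration of the "non-invasive, auto-discovering" expectation above, the sketch below wraps an application's functions at runtime without editing their source. The `instrument_module` helper and the `app` module are hypothetical names, not part of any real agent:

```python
import functools
import types

CALLS = []  # stand-in for the agent's record of discovered transactions

def instrument_module(mod):
    """Wrap every public function in `mod` with a call recorder,
    without touching the application's own source code."""
    for name, obj in list(vars(mod).items()):
        if isinstance(obj, types.FunctionType) and not name.startswith("_"):
            @functools.wraps(obj)
            def wrapper(*args, __fn=obj, **kwargs):
                CALLS.append(__fn.__name__)  # an auto-discovered "transaction"
                return __fn(*args, **kwargs)
            setattr(mod, name, wrapper)

# A stand-in for application code the monitoring agent has never seen.
def place_order(n):
    return f"order-{n}"

app = types.ModuleType("app")
app.place_order = place_order

instrument_module(app)       # no application code was edited
result = app.place_order(7)  # behaves as before, but is now tracked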

 

APM 1.0 tools certainly cannot satisfy these needs.  In the next blog, I will discuss how an APM 2.0 solution like BMC Application Management addresses the challenges with APM 1.0 products and help you manage applications better thus improving customer satisfaction and resulting in better bottomline.

Read the original blog entry...

More Stories By Debu Panda

Debu Panda is a Director of Product Management at Oracle Corporation. He is lead author of the EJB 3 in Action (Manning Publications) and Middleware Management (Packt). He has more than 20 years of experience in the IT industry and has published numerous articles on enterprise Java technologies and has presented at many conferences. Debu maintains an active blog on enterprise Java at http://debupanda.blogspot.com.

Latest Stories
Cloud promises the agility required by today’s digital businesses. As organizations adopt cloud based infrastructures and services, their IT resources become increasingly dynamic and hybrid in nature. Managing these require modern IT operations and tools. In his session at 20th Cloud Expo, Raj Sundaram, Senior Principal Product Manager at CA Technologies, will discuss how to modernize your IT operations in order to proactively manage your hybrid cloud and IT environments. He will be sharing bes...
After more than five years of DevOps, definitions are evolving, boundaries are expanding, ‘unicorns’ are no longer rare, enterprises are on board, and pundits are moving on. Can we now look at an evolution of DevOps? Should we? Is the foundation of DevOps ‘done’, or is there still too much left to do? What is mature, and what is still missing? What does the next 5 years of DevOps look like? In this Power Panel at DevOps Summit, moderated by DevOps Summit Conference Chair Andi Mann, panelists loo...
Cloud applications are seeing a deluge of requests to support the exploding advanced analytics market. “Open analytics” is the emerging strategy to deliver that data through an open data access layer, in the cloud, to be directly consumed by external analytics tools and popular programming languages. An increasing number of data engineers and data scientists use a variety of platforms and advanced analytics languages such as SAS, R, Python and Java, as well as frameworks such as Hadoop and Spark...
Automation is enabling enterprises to design, deploy, and manage more complex, hybrid cloud environments. Yet the people who manage these environments must be trained in and understanding these environments better than ever before. A new era of analytics and cognitive computing is adding intelligence, but also more complexity, to these cloud environments. How smart is your cloud? How smart should it be? In this power panel at 20th Cloud Expo, moderated by Conference Chair Roger Strukhoff, paneli...
"Loom is applying artificial intelligence and machine learning into the entire log analysis process, from start to finish and at the end you will get a human touch,” explained Sabo Taylor Diab, Vice President, Marketing at Loom Systems, in this SYS-CON.tv interview at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
The current age of digital transformation means that IT organizations must adapt their toolset to cover all digital experiences, beyond just the end users’. Today’s businesses can no longer focus solely on the digital interactions they manage with employees or customers; they must now contend with non-traditional factors. Whether it's the power of brand to make or break a company, the need to monitor across all locations 24/7, or the ability to proactively resolve issues, companies must adapt to...
A look across the tech landscape at the disruptive technologies that are increasing in prominence and speculate as to which will be most impactful for communications – namely, AI and Cloud Computing. In his session at 20th Cloud Expo, Curtis Peterson, VP of Operations at RingCentral, highlighted the current challenges of these transformative technologies and shared strategies for preparing your organization for these changes. This “view from the top” outlined the latest trends and developments i...
With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend 21st Cloud Expo October 31 - November 2, 2017, at the Santa Clara Convention Center, CA, and June 12-14, 2018, at the Javits Center in New York City, NY, and learn what is going on, contribute to the discussions, and ensure that your enterprise is on the right path to Digital Transformation.
@DevOpsSummit at Cloud Expo taking place Oct 31 - Nov 2, 2017, at the Santa Clara Convention Center, Santa Clara, CA, is co-located with the 21st International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is ...
"We are a monitoring company. We work with Salesforce, BBC, and quite a few other big logos. We basically provide monitoring for them, structure for their cloud services and we fit into the DevOps world" explained David Gildeh, Co-founder and CEO of Outlyer, in this SYS-CON.tv interview at DevOps Summit at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
Join us at Cloud Expo June 6-8 to find out how to securely connect your cloud app to any cloud or on-premises data source – without complex firewall changes. More users are demanding access to on-premises data from their cloud applications. It’s no longer a “nice-to-have” but an important differentiator that drives competitive advantages. It’s the new “must have” in the hybrid era. Users want capabilities that give them a unified view of the data to get closer to customers and grow business. The...
For organizations that have amassed large sums of software complexity, taking a microservices approach is the first step toward DevOps and continuous improvement / development. Integrating system-level analysis with microservices makes it easier to change and add functionality to applications at any time without the increase of risk. Before you start big transformation projects or a cloud migration, make sure these changes won’t take down your entire organization.
Artificial intelligence, machine learning, neural networks. We’re in the midst of a wave of excitement around AI such as hasn’t been seen for a few decades. But those previous periods of inflated expectations led to troughs of disappointment. Will this time be different? Most likely. Applications of AI such as predictive analytics are already decreasing costs and improving reliability of industrial machinery. Furthermore, the funding and research going into AI now comes from a wide range of com...
"When we talk about cloud without compromise what we're talking about is that when people think about 'I need the flexibility of the cloud' - it's the ability to create applications and run them in a cloud environment that's far more flexible,” explained Matthew Finnie, CTO of Interoute, in this SYS-CON.tv interview at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
Internet of @ThingsExpo, taking place October 31 - November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 21st Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The Internet of Things (IoT) is the most profound change in personal and enterprise IT since the creation of the Worldwide Web more than 20 years ago. All major researchers estimate there will be tens of billions devic...