Choosing the APM System that Is Right for You

In my role as technology evangelist, I spend a lot of time helping organizations, big and small, make their IT systems better, faster and more resilient to faults in order to support their business operations and objectives. I always find it frustrating to "argue" with our competitors about which solution is best. I honestly think that many APM tools on the market do a good job - each with advantages and disadvantages in certain use cases. There is no "one size fits all" - there is just a "this tool fits best for your APM Maturity Level" (not saying the others wouldn't do a good job).

A lot of the arguing in the APM space is about the fundamental approach to monitoring application transactions: monitor and capture ALL details vs. monitor and capture relevant details. Along with that come topics like "overhead impact", "scalability" and "data hoarding vs. smart analytics".

Ultimately, you want to pick the right tool to solve your problems. As you have multiple tools to choose from, let me highlight some of the use cases that our customers solve. As a technologist and a blogger, what I really care about is that the right technology is applied to the right problem. As such, I feel compelled to share what I have learned working with customers in the trenches. Hopefully this will help you understand the technology and what problems it can solve in real life, and cut through the propaganda. Let me start with a few use cases today and follow up with more in subsequent blog posts.

Use Cases from Steven - A Performance Engineer
The first use cases come from Steven - whom I reached out to after reading his question on our APM Community Forum. His company decided to move from a competitor to our APM solution, and I wondered why. In an email, he highlighted that he had had some initial success with the tool and had been able to solve a couple of low-hanging problems. When they decided to take a strategic Continuous Delivery approach to software delivery, they realized that the tool had certain shortcomings that slowed their efforts to practice DevOps.

They identified the following key problems they needed to solve, and what they really required from an APM solution in order to get where they were heading:

See how a user got to a problem, not just the problem itself

  • Every transaction, with all details they need, out-of-the-box
  • Web request/response bytes, SQL bind values, exception details for every transaction

Number of transactions executed per user and tenant used for business and cost reporting

  • Capture custom business context data for every transaction
  • Business transactions based on "buried" context data as not every detail is in the URL

Eliminate homegrown tools which are costly to maintain

  • Provide application as well as system and infrastructure monitoring
  • Integrate with other tools such as JMeter, LoadRunner, Jenkins or HP OpenView

Eliminate the need to make people look at other tools and data

  • Foster collaboration across Architect, Dev, Test & Ops by using the same data set
  • Data must be shareable with a single click

Ability to extend to custom frameworks, systems and protocols

  • Bring in custom metrics from external tools via the Java Plugin infrastructure (see the sketch below)
  • Follow transactions across any custom protocols or technologies outside Java & .NET
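
To make the first bullet more concrete, here is a minimal sketch of what such a metrics plugin can look like. The MetricSink interface is hypothetical - it stands in for whatever callback your APM tool's plugin API actually provides - and the JMX heap query is just one example of an external metric source:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;
import javax.management.openmbean.CompositeData;

/** Hypothetical stand-in for the APM tool's plugin callback. */
interface MetricSink {
    void report(String metricName, double value);
}

public class HeapUsagePlugin {
    /** Called periodically by the (hypothetical) plugin runtime. */
    public void poll(MetricSink sink) throws Exception {
        MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
        // Read heap usage from the standard java.lang:type=Memory MBean
        CompositeData heap = (CompositeData) mbs.getAttribute(
                new ObjectName("java.lang:type=Memory"), "HeapMemoryUsage");
        sink.report("custom.heap.used.bytes",
                ((Number) heap.get("used")).doubleValue());
    }

    public static void main(String[] args) throws Exception {
        // Standalone test run: print the metric instead of reporting it
        new HeapUsagePlugin().poll((name, value) ->
                System.out.printf("%s = %.0f%n", name, value));
    }
}
```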

Full Automation to support Continuous Delivery

  • Use metrics provided by APM for every build artifact along the deployment pipeline as a quality gate (see the sketch below)
  • Inform APM about new deployments to prevent false alerting
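
The quality-gate idea is simple enough to show in a few lines. This is a sketch only - the REST endpoint and its plain-number response format are assumptions, so substitute your APM server's real API - but it shows how a Jenkins job (or any CI server) can pull a metric for a build after the load test and fail the pipeline on a regression:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ResponseTimeGate {
    public static void main(String[] args) throws Exception {
        String build = args[0];                         // e.g. "build-1234"
        double maxMillis = Double.parseDouble(args[1]); // e.g. "250"

        // Hypothetical endpoint - substitute your APM server's actual API.
        HttpRequest request = HttpRequest.newBuilder(URI.create(
                "http://apm.example.com/api/metrics/p95?build=" + build))
                .build();
        String body = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body();

        double p95 = Double.parseDouble(body.trim());
        if (p95 > maxMillis) {
            System.err.printf("Quality gate FAILED: p95=%.1fms > %.1fms%n",
                    p95, maxMillis);
            System.exit(1); // non-zero exit marks the CI stage as failed
        }
        System.out.printf("Quality gate passed: p95=%.1fms%n", p95);
    }
}
```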

Replace traditional application logging

  • Eliminate log files, which saves I/O and storage
  • Capture log messages in the context of the transaction and the user that triggered them (see the sketch below)
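
To illustrate the logging point: even with plain SLF4J you can attach transaction and user identifiers to every log line via the MDC. An APM solution that captures log messages per transaction does this correlation for you; the snippet just shows the idea, and the key names "txnId" and "user" are my own, not a product convention:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class CheckoutService {
    private static final Logger log =
            LoggerFactory.getLogger(CheckoutService.class);

    public void checkout(String transactionId, String user) {
        // Attach context so every log line in this request carries it
        MDC.put("txnId", transactionId);
        MDC.put("user", user);
        try {
            log.info("checkout started"); // carries txnId and user via MDC
            // ... business logic ...
        } finally {
            MDC.clear(); // don't leak context to the next request on this thread
        }
    }
}
```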

One solution for everything

  • Not just performance monitoring but also business reporting as well as deep dive diagnostics

Active community forum

  • Get answers right away
  • Leverage extensions already provided by the community such as plugins for Jenkins, PagerDuty, ...

Let me give you some examples for Steven's use cases so that you can better decide whether they are relevant for you as well:

Every Transaction with All Details
dynaTrace was built from the ground up to support the full software lifecycle. We at Compuware APM/dynaTrace understood that we needed a technology that captures every transaction with all details - for root-cause diagnostics as well as proper business monitoring - without falling into a sampling mode where you lose critical information for both. Most of our customers report little to acceptable overhead in production while capturing 100% of transactions, including method arguments, SQL statements, log messages or exceptions. The magic word in our case is our PurePath (see the YouTube video) & PureStack Technology, which allows dynaTrace to do exactly that. One of the several visualizations of the PurePath is the Transaction Flow, which is a great way to understand how your transactions flow through the system - where your hotspots are (third-party impact, custom code issues or impact of Garbage Collection) and where your architectural issues are (e.g., too many web service calls, too many SQL executions):

Transaction Flow: One View that tells it all to Devs, Architects and Operations Teams

What if you don't capture all transactions but are "smart" and focus on capturing only the problematic ones? While this approach allows you to find and fix the easy problems - those revealed by transactions that fail or violate an average response-time-based baseline - it falls short when it comes to problems caused by transactions that are not "outside the norm". One example here is a database deadlock we recently analyzed for a customer. The "smart" approach only highlighted the transaction that hit the deadlock, but no information was captured for those transactions that actually caused the deadlock with their data manipulations. Being able to see which transactions executed which UPDATE statements in the time leading up to the deadlock is required to solve this problem.
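
If you want to see the pattern for yourself, here is a minimal sketch of that deadlock scenario: two concurrent transactions UPDATE the same two rows in opposite order. The table, JDBC URL and amounts are illustrative. Note that the database picks only one transaction as the deadlock victim - which is exactly why a tool that captures only the failing transaction misses half the story:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class DeadlockDemo {

    // One transaction: update row `first`, wait, update row `second`, commit.
    static void transfer(String url, int first, int second) throws Exception {
        try (Connection con = DriverManager.getConnection(url)) {
            con.setAutoCommit(false);
            try (PreparedStatement upd = con.prepareStatement(
                    "UPDATE accounts SET balance = balance + 10 WHERE id = ?")) {
                upd.setInt(1, first);
                upd.executeUpdate();  // row lock on `first` is now held
                Thread.sleep(1000);   // widen the race window
                upd.setInt(1, second);
                upd.executeUpdate();  // blocks if the other txn holds `second`
            }
            con.commit();
        }
    }

    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost/shop"; // illustrative URL
        // Thread 1 locks row 1 then 2; thread 2 locks row 2 then 1 -> deadlock.
        Thread t1 = new Thread(() -> { try { transfer(url, 1, 2); } catch (Exception e) { e.printStackTrace(); } });
        Thread t2 = new Thread(() -> { try { transfer(url, 2, 1); } catch (Exception e) { e.printStackTrace(); } });
        t1.start(); t2.start();
        t1.join(); t2.join();
    }
}
```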

As companies such as Steven's reach a maturity level where they grow out of "smart" average response-time-based analysis, it is important to have the ability to look at everything and not just the average problem. As a follow-up, read the blog Why Averages Suck and Percentiles are Great!

Capture Custom Business Context
What is Custom Business Context? The actual business function executed - such as "Create Claim" or "Transfer Money" - or the name of the user or tenant of your system. Why is this not as easy as it sounds? Because many applications just don't show the business function as part of the URL or provide the user name in a cookie. A great example was given in a webinar by NJM Insurance (New Jersey Manufacturers Insurance). They were using third-party claim management software that was designed to "hide" everything behind a claimCenter.do URL. In their case they needed dynaTrace to analyze every single transaction and pick up a method argument from the business layer of their app to figure out which function in their system was actually executed. On top of that, they also needed to know the user who executed that function, because their quarterly business reports had to show which insurance office and group of employees created how many claims. The following shows business reporting based on the user role, where the user role gets captured from a method argument within the business logic of the application:

Business Reporting requires Business Context data for every Transaction

This was only possible because dynaTrace allows you to selectively capture business context in the context of every single executed transaction. Along the PurePath you will then see things like method arguments, return values, bind values, session variables, HTTP parameters or cookie values - all of which can later be used for business reporting or targeted root-cause diagnostics. Here is a follow-up blog post that explains business transactions in more technical detail.
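
To picture the NJM situation in code: every request goes through the same generic entry point, and the real business function only shows up as a method argument in the business layer. The class and method names below are made up for illustration; how you tell your APM tool to capture the argument (e.g., a sensor on executeAction) is product-specific:

```java
public class ClaimCenterController {

    // Every request arrives here - the claimCenter.do URL alone says
    // nothing about the business function being executed.
    public void handleRequest(String action, String userRole) {
        executeAction(action, userRole);
    }

    // An APM sensor placed on this method could capture `action` and
    // `userRole` as business context for the surrounding transaction,
    // enabling per-function and per-office reporting.
    private void executeAction(String action, String userRole) {
        switch (action) {
            case "CreateClaim" -> System.out.println("creating claim as " + userRole);
            case "CloseClaim"  -> System.out.println("closing claim as " + userRole);
            default            -> System.out.println("unknown action: " + action);
        }
    }

    public static void main(String[] args) {
        new ClaimCenterController().handleRequest("CreateClaim", "office-7-agent");
    }
}
```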

For more APM Buyer's tips, and for further insight, click here for the full article.

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.

