Suppress Your Data!

Avoiding Noise in Incident Management


Suppression. According to the thesaurus, this word is synonymous with terms like deletion, elimination, and annihilation.

Yet within the context of incident management, suppression means something quite different. It’s not about getting rid of data forever. Instead, it’s a way of mitigating noise so that admins focus on the right alerts at the right time.

Here’s a look at how suppression helps streamline incident management.

Why Suppression is Important

Why is suppression useful in incident management? Simply put, modern infrastructure generates a huge volume of alerts, and admins can’t reasonably be expected to review each and every one. If they try, they will soon suffer from alert fatigue: overwhelmed and burned out, they begin ignoring potentially important alerts. And once they stop paying attention to alerts, the entire incident management process breaks down.

Alert suppression is a way of avoiding this issue. By suppressing alerts of certain types, admins can ensure that actionable, high-priority alerts receive the greatest attention. They can also reduce the overall number of alerts that appear on their dashboards, which helps keep alert fatigue at bay.

As an example, consider an organization whose workstations reboot once a week overnight after updates are installed. Each reboot would generate a series of alerts as workstations go offline and come back up. Adding these to the incidents dashboard that admins see wouldn’t be helpful, because the alerts reflect a routine procedural event that requires no action. To avoid adding this unhelpful noise, admins can configure their incident management software to suppress alerts related to a workstation rebooting.
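
To make the example concrete, here’s a minimal sketch in Python of what such a rule might look like. The alert fields, the host_reboot type, and the 02:00-04:00 window are assumptions for illustration; real incident management tools typically express this as configuration rather than code.

```python
from datetime import time

# Hypothetical alert shape, for illustration only:
#   {"source": "ws-042", "type": "host_reboot", "timestamp": <datetime>}
MAINTENANCE_START = time(2, 0)   # assumed weekly reboot window: 02:00-04:00
MAINTENANCE_END = time(4, 0)

def should_suppress(alert):
    """Suppress routine reboot alerts that fire inside the maintenance window."""
    if alert["type"] != "host_reboot":
        return False
    fired_at = alert["timestamp"].time()
    return MAINTENANCE_START <= fired_at <= MAINTENANCE_END

def route(alert, dashboard, archive):
    """Every alert is recorded; only non-suppressed alerts reach the dashboard."""
    alert["suppressed"] = should_suppress(alert)
    archive.append(alert)                # suppressed or not, the data is kept
    if not alert["suppressed"]:
        dashboard.append(alert)          # only actionable alerts surface
```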

Suppression: Not an Either/Or Proposition

An important point to understand is that alert suppression is not an either/or proposition. In other words, admins’ options are not limited to enabling all alerts of a certain type or permanently suppressing all of them.

Instead, they can take a more nuanced approach. For example, suppression could be configured so that alerts of a given type are suppressed unless they occur repeatedly within a certain period of time. Alerts could also be reported if they occur during a certain time of day, but suppressed at other times. Similarly, admins might want to suppress alerts of a particular type on a certain kind of device, but not on others.
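
As a rough illustration of how such conditions might compose, the sketch below combines alert type, time of day, and device type into a single suppression policy. The helpers and field names are hypothetical, not taken from any particular product.

```python
from datetime import time

# Hypothetical rule helpers; each returns a predicate that is True when
# the alert should be suppressed. Field names are assumed, not a real API.
def during(start, end):
    """Suppress only when the alert fires between start and end (time of day)."""
    return lambda alert: start <= alert["timestamp"].time() <= end

def on_device_type(device_type):
    """Suppress only for a particular kind of device."""
    return lambda alert: alert.get("device_type") == device_type

def all_of(*rules):
    """Suppress only when every condition holds."""
    return lambda alert: all(rule(alert) for rule in rules)

# Example policy: suppress reboot alerts overnight, but only on workstations.
# The same reboot on a production server would still surface.
suppress_policy = all_of(
    lambda a: a["type"] == "host_reboot",
    during(time(0, 0), time(5, 0)),
    on_device_type("workstation"),
)
```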

This flexibility matters because it lets admins get the most out of their alerts. Instead of applying broad, blunt suppression policies, they can tune suppression settings to maximize the visibility of important events without adding unnecessary noise to the incident management system.

Nuanced suppression could be helpful in the example above. As I noted, admins generally don’t want to receive alerts when a workstation reboots in the middle of the night following a software update. But if the incident management software detects a workstation that reboots multiple times during the same period, that could signal a problem (like a flawed software update) that admins will want to know about. In this situation, configuring suppression so that only recurring reboots generate incidents in the central dashboard would help optimize incident management effectiveness.
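
One way to express that kind of rule is a simple sliding-window count per host, as in the sketch below. The threshold and window are assumed values for illustration, not recommendations.

```python
from collections import defaultdict, deque
from datetime import timedelta

REPEAT_THRESHOLD = 3             # assumed: three reboots...
WINDOW = timedelta(hours=1)      # ...within an hour warrants an incident

_recent_reboots = defaultdict(deque)   # host -> timestamps of recent reboot alerts

def is_recurring(alert):
    """True once the same host has rebooted REPEAT_THRESHOLD times inside WINDOW."""
    events = _recent_reboots[alert["source"]]
    events.append(alert["timestamp"])
    while events and alert["timestamp"] - events[0] > WINDOW:
        events.popleft()         # drop reboots that fell out of the sliding window
    return len(events) >= REPEAT_THRESHOLD
```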

Suppression Doesn’t Mean Losing Data

It’s also worth emphasizing that suppression in the context of incident management does not mean that suppressed alerts disappear forever. On the contrary, suppressed alerts still happen, and data related to them should be saved. The only difference between a suppressed alert and a non-suppressed one is that the former is not sent to priority dashboards in the incident management system.

This is important to understand because it means that admins retain the ability to look up suppressed alerts to gain insight into an incident if they need to. This also helps them better tune their alerting thresholds. In addition, suppressed alerts still figure into historical incident management data, which can be used to reveal lots of valuable information about infrastructure efficiency and health trends.
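
Continuing the hypothetical archive from the earlier sketches, looking up suppressed alerts for post-incident analysis could be as simple as filtering the stored records; the suppressed flag and field names remain assumptions.

```python
def suppressed_alerts_for(archive, source, since):
    """Pull suppressed alerts for one host to add context during an investigation."""
    return [
        a for a in archive
        if a["source"] == source
        and a["timestamp"] >= since
        and a.get("suppressed", False)
    ]

# Usage, continuing the hypothetical archive from the earlier sketch:
#   suppressed_alerts_for(archive, "ws-042", incident_start - timedelta(hours=24))
```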

With suppression, then, you get to have your alerts and eat them, too—or something like that.

Suppressed alerts can be leveraged in any way admins need to help identify and respond to incidents, but they don’t clutter dashboards with non-actionable information that gets in the way of resolving higher-priority incidents. Moreover, suppression can be tuned so that alerts are suppressed only under exactly the right circumstances, and because suppressed alerts are still recorded, you retain full visibility into your infrastructure.

 

The post Suppress Your Data! appeared first on PagerDuty.

