
Micro-Architectures Need Relational, Application-Driven Monitoring

Is your monitoring strategy evolving along with your application and infrastructure architectures?

As "applications" continue to morph into what we once might have called "mashups" but no longer do because, well, SOA is officially dead, dontcha know, it is increasingly important for a variety of constituents within organizations - from business stakeholders to application owners to devops - to understand the overall "health" of an application.

Traditional monitoring techniques approach the problem from an infrastructure point of view. That is, the technique is really more of a pool and resource monitor than an application monitor. Each individual service that comprises an application is monitored individually, with no real view of how the "application" itself is performing.

[Image: traditional monitoring]

Now the problem with this approach is that different applications may share the same services (especially in an API-driven model) but have very different performance and availability requirements. It may be completely acceptable for an internal application to respond more slowly than a consumer-facing application, for example.
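
To make that concrete, here is a minimal sketch (all application names, the shared "catalog" service, and the thresholds are illustrative, not from any particular product) of evaluating one shared service's latency against per-application SLAs, so the same measurement can be acceptable for an internal application but degraded for a consumer-facing one:

```python
from dataclasses import dataclass

@dataclass
class AppSLA:
    name: str
    max_latency_ms: float  # acceptable response time for this application

# Both applications consume the same hypothetical shared "catalog" service.
slas = [
    AppSLA("internal-reporting", max_latency_ms=2000),
    AppSLA("consumer-storefront", max_latency_ms=300),
]

def evaluate(shared_service_latency_ms: float) -> dict[str, str]:
    """Return a per-application verdict for one shared-service measurement."""
    return {
        sla.name: "ok" if shared_service_latency_ms <= sla.max_latency_ms else "degraded"
        for sla in slas
    }

# The same 450 ms response is fine internally but a problem for the storefront.
print(evaluate(450))  # {'internal-reporting': 'ok', 'consumer-storefront': 'degraded'}
```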

Thus organizations are left with a view that accurately informs them as to the current health of individual services, but no real way to combine those views into a picture of how the application is performing.

Application-Driven Monitoring
What we really need is the ability to monitor not only the performance and health of individual services but also the application as a whole - even if that application is just a mashup of other applications or services.

[Image: modern monitoring]

Important to remember, too, is that applications aren't limited to a single protocol, like HTTP. Consider an application like Microsoft Exchange, which can be - and frequently is - accessed via multiple protocols. It may be necessary to monitor a variety of services in order to determine the actual health and availability of the application.
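
As a sketch of that idea - assuming an Exchange-like mail application reachable at a hypothetical host, where the application only counts as healthy if every protocol it is consumed over responds - the probes below are just TCP connect checks, the cheapest possible stand-in for real per-protocol monitors:

```python
import socket

APP = "mail"
HOST = "mail.example.com"          # hypothetical endpoint
CHECKS = {"https": 443, "imaps": 993, "smtp-submission": 587}

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Cheapest possible per-protocol probe: can we complete a TCP connect?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def application_health() -> dict:
    per_service = {name: port_open(HOST, port) for name, port in CHECKS.items()}
    return {
        "application": APP,
        "services": per_service,
        # The application is only "up" when every protocol it is accessed over is up.
        "healthy": all(per_service.values()),
    }

print(application_health())
```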

[Image: application health score]

The key is to monitor not just individual services (that's important, but it's not the whole enchilada) but also the application as a whole. This provides business and application stakeholders with a better view of how IT is servicing their needs and also offers IT significant value in understanding the impact of individual services on application and business services.

For example, if the same service is used for multiple applications and the service starts degrading, it should (logically) impact the health of every application. Noticing this early on enables IT to proactively deal with the situation, up to and including notifying all the application owners that there's an issue with a core service and IT is already on the case, before the call comes in. Being able to further monitor and analyze performance across time enables the identification of outliers earlier. By spotting these leading indicators of trouble, it can be possible to head off an outage or performance degradation before it occurs, leaving application and business stakeholders blissfully ignorant of what might have been a disastrous incident.
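
A minimal sketch of that relationship (the application names, dependencies, and health values are all made up for illustration) shows how degradation in one shared service should lower the score of every application that depends on it, and how the same mapping tells IT which application owners to notify:

```python
APPS = {
    "storefront": ["auth", "catalog", "payments"],
    "mobile-api": ["auth", "catalog"],
    "back-office": ["auth", "reporting"],
}

# 1.0 = fully healthy, 0.0 = down; imagine these fed by per-service monitors.
service_health = {"auth": 1.0, "catalog": 0.4, "payments": 1.0, "reporting": 1.0}

def app_scores() -> dict[str, float]:
    """Score each application as the weakest of its component services."""
    return {app: min(service_health[s] for s in deps) for app, deps in APPS.items()}

def impacted_by(service: str) -> list[str]:
    """Which application owners should be notified when this service degrades?"""
    return [app for app, deps in APPS.items() if service in deps]

print(app_scores())            # storefront and mobile-api both drop with 'catalog'
print(impacted_by("catalog"))  # ['storefront', 'mobile-api']
```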

It can also be the case that sudden demand for an application negatively impacts the performance or availability of a shared service, which in turn, of course, impacts applications that use that service. By monitoring all the pieces of the application, the source of increased demand can be more easily correlated and a strategy to address it formulated.
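
One simple way to do that correlation - sketched below with made-up sample windows and using nothing more exotic than Python's standard library (statistics.correlation requires Python 3.10+) - is to compare each application's request rate against the shared service's latency over the same time buckets; the application whose demand tracks the latency most closely is the likely source:

```python
from statistics import correlation  # Python 3.10+

shared_service_latency_ms = [120, 130, 125, 340, 420, 410]   # six time windows
request_rate = {
    "storefront": [200, 210, 205, 900, 1100, 1050],          # spikes with latency
    "back-office": [50, 55, 48, 52, 49, 51],                  # roughly flat
}

for app, rates in request_rate.items():
    r = correlation(rates, shared_service_latency_ms)
    print(f"{app}: r={r:.2f}")  # the app with r near 1.0 is the likely source of demand
```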

Monitoring is a critical (and sadly often overlooked and underappreciated) function in the data center. Without it, modern methods of scalability (elasticity) and orchestrated responses to failure would not be possible. Because it is so critical, it's important to ensure that your monitoring capabilities, and your use of them, are supporting modern architectures, networks and services.

Without monitoring, there's really no way to recognize and react to failures, overloads, and outages. So make sure your monitoring strategy is evolving along with your data center infrastructure and applications.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
