Cloud Computing: How It Affects Enterprise and Performance Monitoring

New Technology, Same Old Problems

In recent years, cloud computing has come to play a dominant role in the industry. Whether you feel positively or negatively about it, it is undeniable that the cloud, like any other component of your infrastructure, needs to be monitored, perhaps more than any other. For more traditional monitoring solutions, the cloud creates a number of obstacles: you do not own the hardware, it does not run on your network, and when problems or glitches occur you have no direct control over their resolution.

Today there is a wide range of utilities to help you manage your cloud infrastructure, and most of them can respond to the disappearance of an instance by starting up a new one with only a little direction from you. But how can this constant churn be integrated into your existing monitoring?
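To make the churn concrete, here is a minimal sketch that diffs the set of currently running instances against the set your monitoring already knows about. The list_running_instances() helper and the instance IDs are hypothetical placeholders for whatever your cloud provider's API actually returns.

```python
# Sketch: detect instance churn so monitoring can follow it.
# list_running_instances() is a hypothetical stand-in for a call to
# your cloud provider's API (e.g. a "describe instances" endpoint).

def list_running_instances():
    # Placeholder data; in practice this would query the provider.
    return {"i-0a1b", "i-0c2d", "i-0e3f"}

def diff_instances(known):
    """Return (launched, terminated) relative to what monitoring knows."""
    current = list_running_instances()
    launched = current - known
    terminated = known - current
    return launched, terminated

if __name__ == "__main__":
    known_to_monitoring = {"i-0a1b", "i-9z9z"}   # hypothetical previous state
    launched, terminated = diff_instances(known_to_monitoring)
    print("register:", launched)      # instances monitoring should pick up
    print("deregister:", terminated)  # instances monitoring should drop
```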

More traditional tools often require lengthy, error-prone configuration changes, along with a full restart of the service before those changes take effect. It is not uncommon for an organization to start up and shut down upward of 100 instances per day, roughly one new instance every 15 minutes. Constantly restarting the monitoring tools to keep pace is unsustainable and means checks may never have enough time to run, making the monitoring data far less valuable. That wipes out much of the benefit cloud computing brings to a company whose business processes depend on the health of its IT infrastructure.

The tools you use to track cloud instances must be as adaptable and customizable as the instances themselves. Your monitoring should absorb minute-by-minute changes without a restart, and it should not depend on manual reconfiguration: having staff constantly update the system by hand does not scale and invites mistakes. Ideally, the tool should offer a capable application programming interface (API) so that monitoring configuration can be driven from your central management and automation.
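As a rough illustration of that kind of API-driven configuration, the sketch below registers and deregisters hosts with a monitoring server over HTTP instead of editing config files and restarting. The endpoint URL, the token, and the /hosts routes are hypothetical, and the requests library is a third-party dependency.

```python
# Sketch: drive monitoring configuration through an API rather than
# config-file edits and service restarts. All endpoints are hypothetical.
import requests

MONITORING_API = "https://monitoring.example.com/api"   # hypothetical
API_TOKEN = "replace-me"                                 # hypothetical
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def register_host(instance_id, address):
    """Tell the monitoring server to start checking a new instance."""
    resp = requests.post(
        f"{MONITORING_API}/hosts",
        json={"id": instance_id, "address": address, "checks": ["cpu", "disk"]},
        headers=HEADERS,
        timeout=10,
    )
    resp.raise_for_status()

def deregister_host(instance_id):
    """Remove a terminated instance so it stops generating false alerts."""
    resp = requests.delete(f"{MONITORING_API}/hosts/{instance_id}",
                           headers=HEADERS, timeout=10)
    resp.raise_for_status()
```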

As for getting data from the cloud into the monitoring system, remember that the point of the cloud is additional capacity without additional hardware to own. A heavyweight agent will slow your applications or drive up your per-instance cost just to collect data. A lighter agent that can be tailored to collect only what you need is the better choice.
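A lightweight agent can be little more than a loop that samples the handful of metrics you care about and pushes them as JSON. This sketch assumes the third-party psutil and requests libraries and a hypothetical ingest endpoint.

```python
# Sketch: a deliberately small push agent. psutil and requests are
# third-party libraries; the ingest URL is a hypothetical placeholder.
import time
import psutil
import requests

INGEST_URL = "https://monitoring.example.com/api/metrics"  # hypothetical

def collect():
    """Sample only the metrics this instance actually needs to report."""
    return {
        "timestamp": int(time.time()),
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def main():
    while True:
        try:
            requests.post(INGEST_URL, json=collect(), timeout=5)
        except requests.RequestException:
            pass  # keep the agent cheap: drop the sample, try again next cycle
        time.sleep(60)  # one sample per minute keeps overhead negligible

if __name__ == "__main__":
    main()
```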

Data Exposure
Many service providers and applications expose data so it can be collected remotely without an agent, typically through a REST API that returns JSON or XML. Pulling your data this way has minimal impact on the systems being monitored. Where the data you need is not exposed, you will want the option of running an agent or writing a small script to expose it, whichever suits your needs. What is often overlooked is that data mined from the cloud should be handled like any other data point: the metrics a cloud provider offers out of the box may not, on their own, give useful insight into the performance of your infrastructure.
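Agentless collection of exposed data can be as simple as polling the provider's REST endpoint and feeding the JSON it returns into your monitoring pipeline like any other data point. The URL and the response fields below are hypothetical examples, not any particular provider's API.

```python
# Sketch: agentless collection from an exposed REST endpoint.
# The URL and JSON fields are hypothetical, not a real provider's API.
import requests

METRICS_URL = "https://api.cloud-provider.example.com/v1/instances/i-0a1b/metrics"

def pull_metrics():
    resp = requests.get(METRICS_URL, timeout=10)
    resp.raise_for_status()
    data = resp.json()          # provider returns JSON (or XML elsewhere)
    # Treat the result like any other data point in the monitoring pipeline.
    return {
        "cpu_percent": data.get("cpu", {}).get("percent"),
        "network_in_bytes": data.get("network", {}).get("in_bytes"),
    }

if __name__ == "__main__":
    print(pull_metrics())
```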

Whether you run in the cloud or not, if your monitoring does not collect the data your business actually needs, you should change it. Good data enables better feedback and better performance, provides context for decisions, and shows whether goals are being met. It allows IT systems to be optimized for the business.

For some organizations, cloud computing has disrupted established business practice, with cloud utilities paving the way for the future. They force a rethink of administrative processes: you can examine your existing tools and discard or refine them for new tasks.

Cloud computing has prompted a genuine business evolution. However, merely slapping a SaaS interface onto legacy code and declaring it 'the cloud' is not enough to achieve your goals. Infrastructure matters: utilities and processes built specifically for the cloud platform bring the flexibility to adapt to change both inside and outside the organization. It is survival of the fittest. Minute-by-minute monitoring brings up-to-the-moment responsiveness, so that your company can thrive in the brave new world of cloud computing.

More Stories By Anne Lee

Anne Lee is a freelance technology journalist, a wife and a mother of two.
