
The Benefits of Hybrid IT: Expectations vs. Reality

Hybrid IT is today's reality, and while its implementation may seem daunting at times, more and more organizations are migrating to the cloud. In fact, according to the SolarWinds 2017 IT Trends Index: Portrait of a Hybrid IT Organization, 95 percent of organizations have migrated crucial applications to the cloud in the past year. As such, it's in every IT professional's best interest to know what to expect.

Certainly, every business has its own expectations of what it would like to gain by migrating applications or infrastructure to the cloud, but the most compelling incentives are by far a simplified management process, faster deployment, and cost efficiency that benefits the bottom line.

Unfortunately, for many organizations, achieving these benefits isn't always easy, or even the reality. As the findings of this year's report illustrate, only one-fourth of IT professionals surveyed have received all of the expected benefits from migrating areas of their organizations' IT infrastructure to the cloud. Part of this may be due to expectations for the cloud that were set too high, and not enough consideration prior to migration. Here are some realities IT professionals should consider.

The Cloud Isn't Cheap
IT departments around the world are being pressured by management to migrate to the cloud, as the cloud is often seen as a cheaper model than hosting applications on-premises. But this isn't always the case; depending on what applications are being migrated and which additional services are selected, the cloud could end up costing an organization much more than an on-premises deployment. By the time IT professionals architect and design their entire stack, the price will likely be higher than expected. With the vast array of services available from companies like Amazon Web Services™ (AWS®), organizations can load up on additional features that offer a greater breadth and depth of application data, but they come at a price.

For example, one organization I'm acquainted with historically maintained their own data centers, but eventually decided to migrate some of their applications to AWS to save money. After all the additional services they added to their AWS solution, they were spending around $7 million a year, which was much more than they anticipated. The company ended up moving some of their services out of the cloud and into a co-location data center model, which cut their IT operational costs down to $2.5 million a year. This example aligns with the research and shows that although cost may be a driver, there are other important considerations; after all, 35 percent of IT professionals who migrated or attempted to migrate infrastructure to the cloud ultimately brought some of it back on-premises or left it there.
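To make the cost comparison concrete, here is a small illustrative sketch. Only the $7 million and $2.5 million totals come from the example above; the individual line items and their breakdown are hypothetical assumptions for the sake of the sketch.

```python
# Illustrative only: a simple annual-cost comparison between deployment
# models. The breakdown into line items is hypothetical; only the totals
# mirror the figures cited in the example above.

def annual_cost(line_items):
    """Sum a dict of annual cost line items (USD)."""
    return sum(line_items.values())

cloud = annual_cost({
    "compute": 3_500_000,
    "storage_and_transfer": 1_500_000,
    "managed_add_on_services": 2_000_000,  # the "extras" that inflate bills
})

colocation = annual_cost({
    "rack_space_and_power": 1_000_000,
    "hardware_amortization": 1_000_000,
    "staff_and_support": 500_000,
})

print(f"Cloud: ${cloud:,}/yr vs co-lo: ${colocation:,}/yr "
      f"({cloud / colocation:.1f}x)")  # -> 2.8x
```

Even a rough model like this, filled in with real quotes, forces the "additional services" onto the balance sheet before migration rather than after.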

Continuous Management
What's most important to remember is that in today's on-demand environment, uptime and an acceptable end-user response time are expected no matter where the application is hosted. Application life cycles will also change, becoming shorter and requiring continuous integration and delivery. Once an application is in the cloud, its services are delivered on demand and need to run at optimum levels constantly to meet SLAs.
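As a quick aid for reasoning about those SLAs, a few lines of arithmetic translate an availability target into a downtime budget. The SLA tiers shown are common industry figures, not numbers from this article.

```python
# Translate an SLA percentage into an allowed-downtime budget.
# Purely arithmetic; the SLA tiers below are common examples.

def allowed_downtime_minutes(sla_percent, period_hours=730):
    """Downtime budget per period (default ~1 month of 730 hours)."""
    return period_hours * 60 * (1 - sla_percent / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% SLA -> {allowed_downtime_minutes(sla):.1f} min/month")
# 99.9% works out to roughly 43.8 minutes of downtime per month
```

Seeing that "three nines" leaves well under an hour a month makes clear why constant monitoring, rather than periodic checks, is the operating model the cloud demands.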

Operating in the cloud involves more than speeds and feeds. In the days of traditional IT, administrators had to conduct spreadsheet exercises to plan data center sizing: they had to consider current needs and estimate a few years of growth potential before they could decide on capacity. With the shift to the cloud, these services are on demand, and IT professionals must now focus on ensuring that SLAs for applications are being met and on keeping the baseline in check.

One of the ways this can be achieved is by implementing a monitoring system that provides a comprehensive view across the entire hybrid IT environment. Such a system will allow IT staff to make informed decisions about which workloads belong on-premises and which belong in the cloud. It will also show, at any given moment, when an application is slowing down or underperforming, whether in the cloud or on-premises, and compare relative performance between the two. By mastering hybrid IT monitoring tools, IT professionals will be able to understand how their applications change over time and track the actual requirements of each application and its workload.
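One minimal way to sketch that baseline tracking, assuming a simple rolling-window statistic rather than any particular monitoring product, is to flag samples that drift well above the established norm. The window size and sigma threshold here are illustrative choices.

```python
# A minimal sketch of baseline-driven alerting: keep a rolling window of
# response-time samples and flag any reading far above the baseline.
# Window size and sigma threshold are illustrative, not prescriptive.

from collections import deque
from statistics import mean, stdev

class BaselineMonitor:
    def __init__(self, window=60, sigma=3.0):
        self.samples = deque(maxlen=window)
        self.sigma = sigma

    def record(self, response_ms):
        """Check the sample against the baseline, then record it."""
        breach = False
        if len(self.samples) >= 2:
            baseline, spread = mean(self.samples), stdev(self.samples)
            breach = response_ms > baseline + self.sigma * spread
        self.samples.append(response_ms)
        return breach

mon = BaselineMonitor(window=10)
readings = [100, 102, 98, 101, 99, 103, 100, 97, 500]  # last one is a spike
alerts = [r for r in readings if mon.record(r)]
print(alerts)  # the 500 ms spike stands out against the ~100 ms baseline
```

Real monitoring platforms do far more (seasonality, percentiles, multi-metric correlation), but the core idea of comparing live performance against a learned baseline is the same.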

Pre-Migration Considerations
Hybrid IT makes IT environments more complex. IT professionals now face a number of moving parts that they are responsible for, many of which they don't control but still have to manage. Their applications may run on somebody else's server and infrastructure, but once an application is there, it doesn't matter where it resides or is hosted: they're still responsible for that application's quality of service and for ensuring that it meets the SLA.

A recent example is the Amazon® S3™ outage, which disrupted many websites and web applications that used the affected AWS S3 region for storage. Even though the applications were running on Amazon's servers, the IT professionals responsible for them still had to make sure they worked. This is why it's critical for IT teams to properly architect applications across multiple cloud services providers: it lowers the risk of downtime and poor performance. Of course, the tradeoff is that this also adds more variables and complexity to the hybrid IT environment.
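The failover idea behind architecting across providers can be sketched as follows. The backend names and fetcher functions are hypothetical stand-ins, not a real SDK integration; in practice each fetcher would wrap a provider's storage client.

```python
# A sketch of "architect across providers": try each storage backend in
# order and fall back on failure. Fetchers here are simulated stand-ins.

def fetch_with_failover(key, backends):
    """Try each (name, fetcher) pair in order until one succeeds."""
    errors = {}
    for name, fetcher in backends:
        try:
            return name, fetcher(key)
        except Exception as exc:
            errors[name] = exc
    raise RuntimeError(f"all backends failed for {key!r}: {errors}")

def s3_primary(key):
    raise ConnectionError("region outage")  # simulate the S3 incident

def secondary_store(key):
    return b"object bytes"  # the replica that keeps the app alive

source, data = fetch_with_failover(
    "report.csv", [("s3", s3_primary), ("backup", secondary_store)])
print(source)  # falls back to the secondary provider
```

This is exactly the tradeoff the article describes: the fallback path keeps the application meeting its SLA during an outage, at the cost of replicating data and managing one more provider.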

IT teams should also consider their level of experience with security before migrating applications to the cloud, especially a public cloud. They should start with a non-mission-critical application and graduate to migrating more critical ones. To ensure data is appropriately protected in the cloud, IT organizations must be very clear about risk mitigation when working with their cloud services provider, and make careful use of security monitoring and management tools.

Best Practices
With the increased migration to the cloud, IT professionals need to be armed with best practices to keep pace with the changing landscape. They should consider the following:

  • Collaboration: Not only is it vital that IT professionals stay connected with colleagues in their industry, they also need to take advantage of the communities being built around new technologies. If they can connect and collaborate with other users, share case studies, and discuss what worked best, they will be well positioned to understand and reap the full benefits of hybrid IT.
  • Communication: IT professionals need strong communication skills. Being able to communicate successes and issues to their management team is very important. They also have to be able to clearly communicate with their cloud services providers, especially if there are applications spread across multiple locations.
  • Comprehensive Skillset: As the hybrid IT landscape continues to change and evolve, IT professionals need to develop a comprehensive skillset to cloud-proof their jobs. Those who build the skills to gain visibility into, and troubleshoot issues within, the hybrid IT environment will be well equipped to transition into this era of IT.

If an organization hasn't started migrating to the cloud, chances are it will very soon. IT professionals need to set expectations with management when it comes to hybrid IT, and communicating those expectations and realities will be vital to proving the success of any deployment. As the IT environment becomes more complex, IT professionals must continue mastering new skillsets to cloud-proof their jobs for the future.

More Stories By Kong Yang

Kong Yang is a Head Geek at SolarWinds with over 20 years of IT experience specializing in virtualization and cloud management. He is a VMware vExpert™, Cisco® Champion, and active contributing thought leader within the virtualization community.

Yang’s industry expertise includes performance tuning and troubleshooting enterprise stacks, virtualization sizing and capacity planning best practices, community engagement, and technology evangelism. He is passionate about understanding the behavior of the entire application ecosystem — the analytics of the interdependencies as well as qualifying and quantifying the results to support the organization’s bottom line.

He focuses on virtualization and cloud technologies; application performance; hybrid cloud best practices; vehicles for IT application stacks such as containers, hypervisors, and cloud native best practices; DevOps conversations; converged infrastructure technologies; and data analytics. Yang is a past influencer at Spiceworks SpiceWorld and VMworld events.

He is also the owner of United States Patent 8,176,497 for an intelligent method to auto-scale VMs to fulfill peak database workloads. Yang’s past roles include a Cloud Practice Leader at Gravitant, a hybrid cloud software startup, and 13 years in various roles at Dell.



Latest Stories
When shopping for a new data processing platform for IoT solutions, many development teams want to be able to test-drive options before making a choice. Yet when evaluating an IoT solution, it’s simply not feasible to do so at scale with physical devices. Building a sensor simulator is the next best choice; however, generating a realistic simulation at very high TPS with ease of configurability is a formidable challenge. When dealing with multiple application or transport protocols, you would be...
WebRTC is great technology to build your own communication tools. It will be even more exciting experience it with advanced devices, such as a 360 Camera, 360 microphone, and a depth sensor camera. In his session at @ThingsExpo, Masashi Ganeko, a manager at INFOCOM Corporation, will introduce two experimental projects from his team and what they learned from them. "Shotoku Tamago" uses the robot audition software HARK to track speakers in 360 video of a remote party. "Virtual Teleport" uses a...
Any startup has to have a clear go –to-market strategy from the beginning. Similarly, any data science project has to have a go to production strategy from its first days, so it could go beyond proof-of-concept. Machine learning and artificial intelligence in production would result in hundreds of training pipelines and machine learning models that are continuously revised by teams of data scientists and seamlessly connected with web applications for tenants and users.
IT organizations are moving to the cloud in hopes to approve efficiency, increase agility and save money. Migrating workloads might seem like a simple task, but what many businesses don’t realize is that application migration criteria differs across organizations, making it difficult for architects to arrive at an accurate TCO number. In his session at 21st Cloud Expo, Joe Kinsella, CTO of CloudHealth Technologies, will offer a systematic approach to understanding the TCO of a cloud application...
"With Digital Experience Monitoring what used to be a simple visit to a web page has exploded into app on phones, data from social media feeds, competitive benchmarking - these are all components that are only available because of some type of digital asset," explained Leo Vasiliou, Director of Web Performance Engineering at Catchpoint Systems, in this SYS-CON.tv interview at DevOps Summit at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.
SYS-CON Events announced today that Secure Channels, a cybersecurity firm, will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Secure Channels, Inc. offers several products and solutions to its many clients, helping them protect critical data from being compromised and access to computer networks from the unauthorized. The company develops comprehensive data encryption security strategie...
SYS-CON Events announced today that App2Cloud will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct. 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. App2Cloud is an online Platform, specializing in migrating legacy applications to any Cloud Providers (AWS, Azure, Google Cloud).
The goal of Continuous Testing is to shift testing left to find defects earlier and release software faster. This can be achieved by integrating a set of open source functional and performance testing tools in the early stages of your software delivery lifecycle. There is one process that binds all application delivery stages together into one well-orchestrated machine: Continuous Testing. Continuous Testing is the conveyer belt between the Software Factory and production stages. Artifacts are m...
WebRTC is the future of browser-to-browser communications, and continues to make inroads into the traditional, difficult, plug-in web communications world. The 6th WebRTC Summit continues our tradition of delivering the latest and greatest presentations within the world of WebRTC. Topics include voice calling, video chat, P2P file sharing, and use cases that have already leveraged the power and convenience of WebRTC.
Cloud resources, although available in abundance, are inherently volatile. For transactional computing, like ERP and most enterprise software, this is a challenge as transactional integrity and data fidelity is paramount – making it a challenge to create cloud native applications while relying on RDBMS. In his session at 21st Cloud Expo, Claus Jepsen, Chief Architect and Head of Innovation Labs at Unit4, will explore that in order to create distributed and scalable solutions ensuring high availa...
For financial firms, the cloud is going to increasingly become a crucial part of dealing with customers over the next five years and beyond, particularly with the growing use and acceptance of virtual currencies. There are new data storage paradigms on the horizon that will deliver secure solutions for storing and moving sensitive financial data around the world without touching terrestrial networks. In his session at 20th Cloud Expo, Cliff Beek, President of Cloud Constellation Corporation, d...
Internet-of-Things discussions can end up either going down the consumer gadget rabbit hole or focused on the sort of data logging that industrial manufacturers have been doing forever. However, in fact, companies today are already using IoT data both to optimize their operational technology and to improve the experience of customer interactions in novel ways. In his session at @ThingsExpo, Gordon Haff, Red Hat Technology Evangelist, shared examples from a wide range of industries – including en...
In IT, we sometimes coin terms for things before we know exactly what they are and how they’ll be used. The resulting terms may capture a common set of aspirations and goals – as “cloud” did broadly for on-demand, self-service, and flexible computing. But such a term can also lump together diverse and even competing practices, technologies, and priorities to the point where important distinctions are glossed over and lost.
In his session at @DevOpsSummit at 20th Cloud Expo, Kelly Looney, director of DevOps consulting for Skytap, showed how an incremental approach to introducing containers into complex, distributed applications results in modernization with less risk and more reward. He also shared the story of how Skytap used Docker to get out of the business of managing infrastructure, and into the business of delivering innovation and business value. Attendees learned how up-front planning allows for a clean sep...
Detecting internal user threats in the Big Data eco-system is challenging and cumbersome. Many organizations monitor internal usage of the Big Data eco-system using a set of alerts. This is not a scalable process given the increase in the number of alerts with the accelerating growth in data volume and user base. Organizations are increasingly leveraging machine learning to monitor only those data elements that are sensitive and critical, autonomously establish monitoring policies, and to detect...