Benefits of an Enterprise-Class Server and Data Consolidation Solution

Many CIOs are rethinking how they architect their IT infrastructure to better deliver applications and services to end users

CIOs cited reducing enterprise costs, improving IT applications and infrastructure, improving efficiency, and improving business processes among their top 10 business priorities, according to a Gartner Executive Program Survey[1] conducted in 2013.

To address these business priorities, many CIOs are rethinking how they architect their IT infrastructure to better deliver applications and services to end users. A 2013 study by IDG Research Services[2] surveyed 333 IT directors and more senior executives at enterprises with more than 500 employees; 77% of IT leaders globally said data center transformation will play a highly important role in delivering on their organizations' business goals.

Instead of allowing remote and branch offices to maintain the hardware and data, many enterprise CIOs today are centralizing storage and backups to their core data centers. This makes sense for the sake of simplicity and ensures IT managers adopt a 'hands-on' approach to device management, which mitigates the security risk and complexity of having multiple backup systems and dispersed data sets. In addition, organizations with offices in sometimes unstable or physically challenging locations may not want any locally stored data in those offices, due to security concerns and potential data risks.

The reality of recovering from a disaster, whether natural or human-induced, can be daunting in a typical branch office setup. Most often, branch data protection is tape-based; recovery can take days, leaving an organization exposed to downtime and data loss. A typical branch disaster recovery requires not only physical hardware replacement, but also a rebuild and patching of the operating system, reinstallation of applications, virus scanning, and full data recovery before returning to service. Organizations that rely on weekly full and daily incremental backups of branch data face the additional challenges of restoring from multiple tapes and losing any new data created between the last captured backup and the outage.
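To see why a weekly-full-plus-daily-incremental scheme multiplies the number of tapes a restore depends on, here is a minimal sketch (hypothetical helper names, not any vendor's tooling) that selects the restore chain for a given outage date:

```python
from datetime import date

def restore_chain(backups, outage):
    """Pick the backups needed to restore a branch as of the outage:
    the most recent full backup on or before the outage, plus every
    incremental taken after that full and up to the outage."""
    fulls = [(d, kind) for d, kind in backups if kind == "full" and d <= outage]
    if not fulls:
        raise ValueError("no full backup precedes the outage")
    base = max(fulls)
    incs = [(d, kind) for d, kind in backups
            if kind == "incremental" and base[0] < d <= outage]
    return [base] + sorted(incs)

# A weekly full on Sunday, Jan 5, plus daily incrementals afterward:
backups = [(date(2014, 1, 5), "full")] + [
    (date(2014, 1, 5 + i), "incremental") for i in range(1, 7)
]
# An outage on Thursday, Jan 9 already needs five tapes to rebuild the
# branch: the Sunday full plus four incrementals.
chain = restore_chain(backups, date(2014, 1, 9))
```

Every tape in that chain must be located, mounted, and read in order, and any data written after the last incremental is simply gone; centralizing backups removes that dependency on branch media entirely.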

Centralizing storage and backups to core data centers also minimizes travel expenses, since data experts are rarely on-site at the edge to provide support and management. According to IDG Research[2], 37% of organizations use non-IT staff to manage backups at remote branch offices, and this number grows to 67% when the branch office is located outside the United States. Centralizing these services in core data centers puts the data back in the hands of IT experts, rather than non-IT staff filling a role they are not equipped to handle.

Business Drivers
A significant driver for many organizations interested in storage consolidation, however, is cost. Companies want to maximize their investments in storage area networks (SANs) and realize the benefits of centralized storage. Chief among those benefits is lower IT cost from eliminating the need to purchase and maintain local storage and server hardware. An emerging approach that bridges the gap between centralized storage and branch performance is storage delivery technology.

CIOs are realizing that core business priorities can be addressed, but doing so starts with addressing core IT processes and transforming the data center.

As part of this movement, IT managers can expand storage capacity and look to extend the benefits of a consolidated approach to larger branch offices and data-intensive applications that previously were difficult or impossible to consolidate because of local performance requirements.

A Changing IT Landscape
Consolidation has the potential to empower businesses of all sizes to remove servers and data from branch offices and centralize them in the secure data center - without sacrificing user experience. This architectural approach makes it possible to centralize backup operations and remove data from high-risk locations, while increasing agility and lowering the cost of managing remote office IT. A high-risk location is any site more susceptible to natural disasters, hacking, political turmoil, or outright theft, as well as any remote area that is difficult to access. To succeed in this dynamic environment, IT leaders need agility, security, and control, while business users demand performance.

Identifying the optimal deployment location for IT assets such as servers and supporting storage systems is one of the more challenging aspects of the IT decision process today. When the edge of the enterprise and the core at the data center are linked together in an integrated solution, IT organizations can centralize control, security, and protection of distributed server and storage assets. This approach ensures timely access to (or recovery of) data and applications relied on by users across the extended organization while maximizing IT agility. Organizations can quickly adjust to changing conditions with the right information, delivered at the right time and in the right place to serve customers and partners better while keeping employees happy and productive.

Protection of Data
In line with growing data volumes, enterprises should be able to decommission branch backup and recovery systems, shifting data protection operations to the secure data center. There, enterprises can apply their well-honed backup and recovery systems, procedures, and skilled personnel to protect branch data.

Snapshots - read-only copies of a data set at a point in time - are an integral part of keeping IT operations running smoothly. In today's IT environment, administrators must be able to quickly set and assign hourly, daily, or weekly storage snapshot policies that deliver application-consistent data protection in conjunction with supported data center storage arrays.
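As one illustration of what such a policy amounts to (a hedged sketch in Python, not any particular storage array's API), a snapshot policy can be modeled as a capture interval plus a retention window, from which the snapshots now due and the snapshots eligible for pruning both fall out directly:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class SnapshotPolicy:
    """A point-in-time snapshot schedule with a retention window."""
    name: str
    interval: timedelta   # how often a snapshot is taken
    retention: timedelta  # how long each snapshot is kept

def due_snapshots(policy, last_run, now):
    """Return the snapshot timestamps owed since last_run, per the policy."""
    times = []
    t = last_run + policy.interval
    while t <= now:
        times.append(t)
        t += policy.interval
    return times

def expired(snapshots, policy, now):
    """Snapshots older than the retention window, eligible for pruning."""
    return [s for s in snapshots if now - s > policy.retention]
```

For example, an hourly policy with one-day retention, last run at midnight and evaluated at 03:30, owes snapshots for 01:00, 02:00, and 03:00, while anything captured more than a day earlier becomes a prune candidate. Real arrays layer application quiescing on top of this scheduling skeleton to make the copies application-consistent.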

Once storage snapshots are created in the data center, in addition to leveraging the disk snapshot for fast recovery, many organizations are replicating to a secondary data center or sending a copy to cloud storage environments to ensure data is located offsite.

Conclusion
Organizations' information bases and requirements are constantly changing in response to shifting customer demands and business requirements. Consolidating systems into centralized data centers must make processes faster and less operationally intensive, both now and in the future. A well-thought-out IT architecture makes it easier to scale, simplifying the extension of services to new user bases and locations. Finally, a solid IT architecture makes it easier for an enterprise to react quickly when disaster strikes or as the needs of the organization change.

Resources

  1. "Gartner Executive Program Survey," Gartner, 2013
  2. "Riverbed Data Center Transformation Survey," IDG Research, 2013

More Stories By Gil Haberman

Gil Haberman is a group marketing manager at Riverbed Technology. Riverbed, with more than $1 billion in annual revenue, is the leader in Application Performance Infrastructure.

