Data Center Consolidation and Adopting Cloud Computing in 2013

Throughout 2012 large organizations and governments around the world continued to struggle with the idea of consolidating inefficient data centers, server closets, and individual “rogue” servers scattered around their enterprises or government agencies.  Key issues included the cost of operating data centers, disaster management of information technology resources, and, of course, human factors centered on control, power, or job retention in a rapidly evolving IT industry.

Cloud computing and virtualization continue to have an impact on all consolidation discussions, not only from the standpoint of providing a much better model for managing physical assets, but also in the potential cloud offers to solve disaster recovery shortfalls, improve standardization, and encourage or enable development of service-oriented architectures.

Our involvement in projects at local, state, and national government levels, in both the United States and other countries, indicates a consistent need to address the following concerns:

  • Existing IT infrastructure, including both IT equipment and facilities, is reaching the end of its operational life
  • Collaboration requirements between internal and external users are expanding quickly, driving an architectural need for interoperability
  • Decision support systems require access to both raw data, and “big data/archival data”

We would like to see an effort within the IT community to move in the following directions:

  1. Real effort at decommissioning and eliminating inefficient data centers
  2. Fitting all data and applications into an enterprise architecture framework – regardless of the size of the organization or its data
  3. Aggressive development of standards supporting interoperability, portability, and reuse of objects and data

Despite the very public failures experienced by cloud service providers over the past year, the reality is that cloud computing as an IT architecture and model is gaining traction, and is not likely to go away any time soon.  As with any emerging service or technology, cloud services will continue to develop and mature, reducing the impact and frequency of failures.

Future Data Centers

Why would an organization continue to buy individual high-powered workstations, individual software licenses, and device-bound storage when the same application can be delivered to a simple display, or a wide variety of displays, with standardized web-enabled cloud (SaaS) applications that store mission-critical data and images on a secure storage system at a secure site?  Why not facilitate the transition from CAPEX to OPEX, license to subscription, infrastructure to product and service development?

In reality, unless an organization is in the hardware or software development business, there is very little technical justification for building and managing a data center.  This includes secure facilities supporting military or other sensitive sites.

The cost of building and maintaining a data center, compared with either outsourcing into a commercial colocation site or virtualizing data, applications, and network access requirements, has gained the attention of CFOs and CEOs, requiring IT managers to more explicitly justify the cost of building internal infrastructure versus outsourcing.  This is quickly becoming a very difficult task.

Money spent on data center infrastructure is lost to the organization.  The cost of labor is high, and so are the costs of energy, space, and maintenance.  This is money that could be better applied to product and service development, customer service capacity, or other revenue- and customer-facing activities.
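To make the build-versus-outsource comparison concrete, here is a minimal sketch of the amortized annual cost of an in-house data center against a subscription-based alternative. All dollar figures, the amortization period, and the subscription rate are hypothetical placeholders for illustration, not benchmarks:

```python
# Illustrative CAPEX-vs-OPEX comparison: in-house data center vs. cloud/colo.
# Every figure below is a hypothetical assumption, not real pricing.

def in_house_annual_cost(capex, amortization_years, annual_opex):
    """Amortized build cost plus yearly labor, energy, space, and maintenance."""
    return capex / amortization_years + annual_opex

def cloud_annual_cost(monthly_subscription):
    """Pure OPEX: subscription fees only, no facility to build or maintain."""
    return monthly_subscription * 12

in_house = in_house_annual_cost(capex=5_000_000,       # assumed build cost
                                amortization_years=10,
                                annual_opex=750_000)   # assumed running cost
cloud = cloud_annual_cost(monthly_subscription=60_000) # assumed subscription

print(f"In-house: ${in_house:,.0f}/yr vs. cloud: ${cloud:,.0f}/yr")
```

With these assumed inputs, the in-house facility costs $1,250,000 per year against $720,000 for the subscription model; the point is not the specific numbers but that the comparison is now an explicit calculation a CFO can interrogate.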

The Bandwidth Factor

The one major limitation the IT community will need to overcome as data center consolidation continues and cloud services become the norm is bandwidth.  Applications such as streaming video, unified communications, and data-intensive workloads will need more bandwidth.  The telecom companies are making progress, having deployed 100 Gbps backbone capacity in many markets.  However, this capacity will need to continue growing quickly to meet the needs of organizations accessing data and applications stored or hosted within a virtual or cloud computing environment.
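A back-of-the-envelope demand estimate shows how quickly consolidated workloads eat into a 100 Gbps backbone. The user counts and per-stream rates below are assumptions chosen purely for illustration:

```python
# Rough aggregate bandwidth demand for a consolidated site.
# User counts and per-user rates are illustrative assumptions.

MBPS = 1_000_000  # bits per second in one megabit

workloads = {
    #                  (concurrent users, bits/sec per user)
    "hd_video_stream": (500,   5 * MBPS),
    "unified_comms":   (2_000, 1 * MBPS),
    "data_apps":       (1_000, 2 * MBPS),
}

total_bps = sum(users * rate for users, rate in workloads.values())
print(f"Peak demand: {total_bps / 1e9:.1f} Gbps")  # against a 100 Gbps backbone
```

Even these modest assumed numbers yield 6.5 Gbps of sustained demand from a single consolidated user population; multiply across agencies sharing one backbone and the headroom disappears fast.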

Consider a national government’s IT requirements.  The government, like most, is based within a metro area.  The agencies and departments consolidate their individual data centers and server closets into a central facility or a reduced number of facilities.  Government interoperability frameworks begin to make small steps allowing cross-agency data sharing, and individual users need access to a variety of applications and data sources to fulfill their decision support requirements.

Consider, for example, a GIS (Geospatial/Geographic Information System) with multiple demographic or other overlays.  Individual users will need to display data drawn from several sources, through GIS applications, rendering a large amount of complex data on individual display screens.  Without broadband access between the user and the application, as well as between the application and its data sources, the result will be a very poor user experience.
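A simple transfer-time calculation makes the user-experience point concrete. The payload size for a base map plus several overlays is a hypothetical figure, assumed here only to show how access speed dominates perceived responsiveness:

```python
# How long a multi-layer GIS view takes to load at different access speeds.
# The payload size is an illustrative assumption, not a measured value.

def load_seconds(payload_mb, link_mbps):
    """Transfer time in seconds: megabytes * 8 bits/byte / megabits-per-sec."""
    return payload_mb * 8 / link_mbps

payload_mb = 40  # assumed: base map plus several demographic overlays
for link_mbps in (10, 100, 1000):
    print(f"{link_mbps:>5} Mbps link: {load_seconds(payload_mb, link_mbps):.1f} s")
```

Under these assumptions the same view takes 32 seconds over a 10 Mbps link but under half a second over a gigabit link, which is the difference between an unusable decision support tool and an interactive one.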

Another example is using the capabilities of video conferencing, desktop sharing, and interactive persistent-state application sharing.  Without adequate bandwidth this is simply not possible.

Revisiting the “4th Utility” for 2013

The final vision on the 2013 “wishlist” is that we, as an IT industry, continue to acknowledge the need for developing the 4th Utility.  This is the idea that broadband communications, processing capacity (including SaaS applications), and storage are the right of all citizens.  Much like the first three utilities – roads, water, and electricity – the 4th Utility must be a basic part of all discussions related to national, state, or local infrastructure.  In the years ahead, Internet-enabled communications, or something much like them, will be an essential part of all our lives.

The 4th Utility requires that high-capacity fiber optic infrastructure and broadband wireless be delivered to any location within the country which supports a community, or an individual connected to a community.  We’ll have to pay a fee to access the utility (same as other utilities), but it is our right to receive it and our obligation to deliver it.

2013 will be a lot of fun for us in the IT industry.  Cloud computing is going to impact everybody – one way or the other.  Individual data centers will continue to close.  Service-oriented architectures, enterprise architecture, process modeling, and design efficiency will drive a lot of innovation.  We’ll lose some players and gain others, and we’ll be in a better position at the end of 2013 than we are today.



More Stories By John Savageau

John Savageau is a lifelong telecom and Internet geek, with a deep interest in the environment and all things green. Whether drilling into the technology of human communications, cloud computing, or describing a blue whale off Catalina Island, Savageau will try to present complex ideas in terms that are easily appreciated and understood.

Savageau is currently focusing efforts on data center consolidation strategies, enterprise architectures, and cloud computing migration planning in developing countries, including Azerbaijan, The Philippines, Palestine, Indonesia, Moldova, Egypt, and Vietnam.

John Savageau is President of Pacific-Tier Communications dividing time between Honolulu and Burbank, California.

A former career US Air Force officer, Savageau graduated with a Master of Science degree in Operations Management from the University of Arkansas and also received Bachelor of Arts degrees in Asian Studies and Information Systems Management from the University of Maryland.
