Data Center Consolidation and Adopting Cloud Computing in 2013

Throughout 2012, large organizations and governments around the world continued to struggle with the idea of consolidating inefficient data centers, server closets, and individual “rogue” servers scattered around their enterprises or government agencies.  The issues included the cost of operating data centers, disaster management of information technology resources, and of course human factors centered on control, power, or job retention in a rapidly evolving IT industry.

Cloud computing and virtualization continue to have an impact on all consolidation discussions, not only from the standpoint of providing a much better model for managing physical assets, but also in the potential cloud offers to solve disaster recovery shortfalls, improve standardization, and encourage or enable development of service-oriented architectures.

Our involvement in projects at the local, state, and national government levels, in both the United States and other countries, indicates a consistent need to address the following concerns:

  • Existing IT infrastructure, including both IT equipment and facilities, is reaching the end of its operational life
  • Collaboration requirements between internal and external users are expanding quickly, driving an architectural need for interoperability
  • Decision support systems require access to both raw data and “big data”/archival data

We would like to see an effort within the IT community to move in the following directions:

  1. Real effort at decommissioning and eliminating inefficient data centers
  2. All data and applications should fit into an enterprise architecture framework – regardless of the size of the organization or data
  3. Aggressive development of standards supporting interoperability, portability, and reuse of objects and data

Despite the very public failures experienced by cloud service providers over the past year, the reality is that cloud computing as an IT architecture and model is gaining traction, and it is not likely to go away any time soon.  As with any emerging service or technology, cloud services will continue to develop and mature, reducing the impact and frequency of failures.

Future Data Centers

Why would an organization continue to buy individual high-powered workstations, individual software licenses, and device-bound storage when the same application can be delivered to a simple display, or a wide variety of displays, with standardized web-enabled cloud (SaaS) applications that store mission critical data and images on a secure storage system at a secure site?  Why not facilitate the transition from CAPEX to OPEX, license to subscription, infrastructure to product and service development?

In reality, unless an organization is in the hardware or software development business, there is very little technical justification for building and managing a data center.  This includes secure facilities supporting military or other sensitive sites.

The cost of building and maintaining a data center, compared with either outsourcing into a commercial colocation site or virtualizing data, applications, and network access requirements, has gained the attention of CFOs and CEOs, requiring IT managers to more explicitly justify the cost of building internal infrastructure vs. outsourcing.  This is quickly becoming a very difficult task.

Money spent on data center infrastructure is lost to the organization.  The cost of labor is high, and so are the costs of energy, space, and maintenance.  That money could be better applied to product and service development, customer service capacity, or other revenue- and customer-facing activities.
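To make the argument concrete, here is a minimal back-of-the-envelope sketch in Python.  Every figure in it (build-out cost, annual operating cost, subscription price) is a hypothetical placeholder for illustration, not sourced data; the point is only the structure of the comparison – a large up-front CAPEX plus recurring OPEX versus a pure subscription.

```python
# A minimal, hypothetical TCO comparison: building and running a small
# in-house data center vs. a cloud/colocation subscription.
# All dollar figures are invented placeholders, not sourced data.

def in_house_tco(years, capex=2_000_000, annual_opex=400_000):
    """Up-front build-out plus yearly power, space, maintenance, and labor."""
    return capex + annual_opex * years

def cloud_tco(years, annual_subscription=550_000):
    """Pure OPEX model: no build-out, just a pay-as-you-go subscription."""
    return annual_subscription * years

for years in (1, 3, 5, 10):
    print(f"{years:>2} yr   in-house: ${in_house_tco(years):>10,}"
          f"   cloud: ${cloud_tco(years):>10,}")
```

Whatever the real numbers turn out to be for a given organization, framing the question this way is exactly the exercise CFOs and CEOs are now asking IT managers to perform.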

The Bandwidth Factor

The one major limitation the IT community will need to overcome as data center consolidation continues and cloud services become the norm is bandwidth.  Streaming video, unified communications, and other data-intensive applications will need more bandwidth.  The telecom companies are making progress, having deployed 100 Gbps backbone capacity in many markets.  However, this capacity will need to continue growing quickly to meet the needs of organizations accessing data and applications stored or hosted within a virtual or cloud computing environment.
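A rough sizing exercise shows why.  The sketch below estimates the aggregate access bandwidth an organization might need once video, unified communications, and remote-display sessions are all delivered from the cloud; the per-stream rates and concurrency ratio are illustrative assumptions, not measurements.

```python
# Hypothetical peak-bandwidth estimate for a site whose users consume
# cloud-hosted video, UC, and virtual-desktop services.
# Per-stream rates (Mbps) are illustrative assumptions.

PER_USER_MBPS = {
    "hd_video_conference": 4.0,   # assumed HD conferencing stream
    "virtual_desktop":     1.5,   # assumed remote-display session
    "voip_and_uc":         0.1,   # assumed voice/UC traffic
}

def peak_demand_gbps(users, concurrency=0.3):
    """Peak demand in Gbps, assuming a fraction of users are active at once."""
    per_user_mbps = sum(PER_USER_MBPS.values())
    return users * concurrency * per_user_mbps / 1000  # Mbps -> Gbps

for users in (1_000, 10_000, 50_000):
    print(f"{users:>6} users -> ~{peak_demand_gbps(users):.1f} Gbps at peak")
```

Under these assumptions, 50,000 consolidated users already approach the 100 Gbps backbone figures the carriers are deploying today, which is why access capacity has to keep growing.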

Consider a national government’s IT requirements.  Assume the government, like most, is based within a metro area.  The agencies and departments consolidate their individual data centers and server closets into a central facility or a reduced number of facilities.  Government interoperability frameworks begin to take small steps toward cross-agency data sharing, and individual users need access to a variety of applications and data sources to fulfill their decision support requirements.

Consider, for example, a GIS (Geospatial/Geographic Information System) with multiple demographic or other overlays.  Individual users will need to display data drawn from several sources, through GIS applications, rendering a large amount of complex data on individual screens.  Without broadband access both between the user and the application, and between the application and its data sources, the result will be a very poor user experience.
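The sketch below, offered under the same caveat, turns that into numbers: it estimates how long a single GIS screen refresh would take at different link speeds, with layer payload sizes invented purely for illustration.

```python
# Hypothetical load-time estimate for one GIS screen refresh composed of
# several data layers. Layer sizes (MB) are invented for illustration.

LAYER_MB = {
    "base_map_tiles":      8.0,   # assumed raster tile payload
    "demographic_overlay": 3.0,   # assumed vector/choropleth layer
    "infrastructure":      2.5,   # assumed roads/utilities layer
}

def refresh_seconds(link_mbps):
    """Seconds to pull all layers over a link of the given speed (Mbps),
    ignoring protocol overhead and caching."""
    total_megabits = sum(LAYER_MB.values()) * 8  # megabytes -> megabits
    return total_megabits / link_mbps

for mbps in (2, 10, 100, 1000):
    print(f"{mbps:>4} Mbps link -> ~{refresh_seconds(mbps):5.1f} s per refresh")
```

On a 2 Mbps link, the same screen that renders in a fraction of a second over fiber takes nearly a minute, which is the “very poor user experience” described above.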

Another example is using the capabilities of video conferencing, desktop sharing, and interactive persistent-state application sharing.  Without adequate bandwidth, none of these is possible.

Revisiting the “4th Utility” for 2013

The final vision on the 2013 “wishlist” is that we, as an IT industry, continue to acknowledge the need for developing the 4th Utility.  This is the idea that broadband communications, processing capacity (including SaaS applications), and storage are the right of all citizens.  Much like the first three utilities (roads, water, and electricity), the 4th Utility must be a basic part of all national, state, or local infrastructure discussions.  As we move forward, Internet-enabled communications, or something much like them, will be an essential part of all our lives.

The 4th Utility requires that high-capacity fiber optic infrastructure and broadband wireless be delivered to any location within the country that supports a community, or to any individual connected to a community.  We’ll have to pay a fee to access the utility (the same as with other utilities), but it is our right to have it and our obligation to deliver it.

2013 will be a lot of fun for us in the IT industry.  Cloud computing is going to impact everybody – one way or the other.  Individual data centers will continue to close.  Service-oriented architectures, enterprise architecture, process modeling, and design efficiency will drive a lot of innovation.  We’ll lose some players, gain others, and we’ll be in a better position at the end of 2013 than we are today.



More Stories By John Savageau

John Savageau is a lifelong telecom and Internet geek, with a deep interest in the environment and all things green. Whether drilling into the technology of human communications, cloud computing, or describing a blue whale off Catalina Island, Savageau will try to present complex ideas in terms that are easily appreciated and understood.

Savageau is currently focusing efforts on data center consolidation strategies, enterprise architectures, and cloud computing migration planning in developing countries, including Azerbaijan, The Philippines, Palestine, Indonesia, Moldova, Egypt, and Vietnam.

John Savageau is President of Pacific-Tier Communications dividing time between Honolulu and Burbank, California.

A former career US Air Force officer, Savageau graduated with a Master of Science degree in Operations Management from the University of Arkansas and also received Bachelor of Arts degrees in Asian Studies and Information Systems Management from the University of Maryland.
