The IoT: Zettabytes Approaching

IBM's Initiative to Build 15 New Datacenters Brings Things Into Focus

"None of us really understands what's going on with all these numbers." Thus said David Stockman, the then-wunderkind budget director for newly elected President Ronald Reagan in 1981.

Stockman was widely ridiculed for such a rare burst of candor from a government official. He was referring to the administration's efforts to grapple with the major budget and tax reforms candidate Reagan had promised the year before.

I think it's fair to borrow these words to describe what's going on in the commingling worlds of Cloud Computing, Big Data, and the Internet of Things (IoT).

I've promised to write about all that's happening with IoT between now and @ThingsExpo June 10-12 in New York, an event for which I serve as Conference Chair.

Zettabytes Take the Stage
But first I have to get a grip on what's going on with all these numbers.

Let's start with a prediction by CSC, the Washington, DC-area IT services provider. I'm reading one of its infographics, which claims Big Data will cause global data storage needs to increase 44-fold by 2020, reaching 35 zettabytes. (It says we had 0.79 zettabyte under control in 2009.)

"Only" 10.5 zettabytes of the 2020 total will be generated by enterprises, according to CSC. But thanks to the cloud, 28 zettabytes will be managed by enterprises.

Break it Down
Let's break this down by imagining a zettabyte. I, for one, am still not comfortable visualizing, abstracting, or using that term. A zettabyte is 1 billion terabytes, 1 million petabytes, or 1,000 exabytes.

Yes, so take today's typical 1 terabyte personal-computing hard drive (worth about $80) and multiply that by a billion to get a single zettabyte. Now imagine storing 28 of them.
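For readers who want to check the arithmetic, here's a minimal Python sketch of the unit conversions; the $80-per-terabyte drive price is just the rough street figure cited above, so the final cost line is illustrative only.

```python
# Decimal storage units: 1 ZB = 1,000 EB = 1,000,000 PB = 1,000,000,000 TB.
TB_PER_ZB = 10**9

drive_tb = 1           # one typical 1 TB consumer hard drive
drive_price_usd = 80   # rough street price cited above

drives_per_zb = TB_PER_ZB // drive_tb
print(f"drives per zettabyte: {drives_per_zb:,}")        # 1,000,000,000
print(f"drives for 28 ZB:     {28 * drives_per_zb:,}")   # 28,000,000,000
print(f"raw drive cost for 28 ZB: "
      f"${28 * drives_per_zb * drive_price_usd / 1e12:.2f} trillion")
```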

The bandwidth requirements for this amount of data will be similarly daunting. If only 1% of that data were zipping around per second, we'd need more than 2 trillion gigabit connections to make it happen.

(1% of 28 zettabytes = 280 exabytes = 280 million terabytes = 280 billion gigabytes = 2.24 trillion gigabits.)
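Here's the same bandwidth math as a quick Python sketch; the 1%-in-flight-per-second figure is, as above, purely an assumption for illustration.

```python
ZB_BYTES = 10**21            # decimal bytes in one zettabyte

data_bytes = 28 * ZB_BYTES   # the 2020 enterprise-managed estimate
in_flight = 0.01             # assume 1% of the data moving each second
bits_per_second = data_bytes * in_flight * 8

gigabit_links = bits_per_second / 10**9
print(f"1-gigabit links needed: {gigabit_links:,.0f}")   # 2,240,000,000,000
```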

We're going to need a bigger boat.

Many Big Datacenters
When we apply the 28-zettabyte figure to datacenters, the initial calculations are equally shocking. They're especially relevant in the wake of the recent news that IBM plans to build 15 new datacenters at a cost of $1.2 billion.

That's $80 million per datacenter, a modest number in the datacenter world, and one that implies an average facility encompassing about 8,000 computers, 80,000 square feet, and perhaps 0.8 exabyte of storage.

To reach 28 zettabytes, we would need only 35,000 of these datacenters in the world. Using IBM's budget for its new datacenter initiative, total cost would come in at 35,000 x $80 million, or $2.8 trillion. If, say, one quarter of them were built in the US, we'd see one every 15 miles or so driving down any road.
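The same back-of-envelope datacenter math, using the per-facility figures extrapolated above from IBM's announcement:

```python
EB_BYTES = 10**18
ZB_BYTES = 10**21

storage_per_dc = 0.8 * EB_BYTES   # ~0.8 exabyte per facility, per the estimate above
cost_per_dc = 1.2e9 / 15          # IBM's $1.2 billion across 15 datacenters

dcs_needed = 28 * ZB_BYTES / storage_per_dc
print(f"datacenters needed: {dcs_needed:,.0f}")                        # 35,000
print(f"total cost: ${dcs_needed * cost_per_dc / 1e12:.1f} trillion")  # $2.8 trillion
```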

Oh, now we have to add about 84,000 megawatts to the electrical grid, which shouldn't require more than around 50 large power plants, whether nuclear or natural-gas. There's also the matter of water usage for cooling, to be measured in the billions of gallons per day.
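And the power side of the sketch, assuming the 84,000 megawatts are spread evenly across all 35,000 facilities; the per-plant capacity is implied by the numbers above, not sourced.

```python
total_mw = 84_000     # estimated grid addition above
datacenters = 35_000
plants = 50

print(f"~{total_mw / datacenters:.1f} MW per datacenter")   # ~2.4 MW
print(f"~{total_mw / plants:,.0f} MW per power plant")      # ~1,680 MW
```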

Can It Happen?
Moore's Law can be expected to work its magic between now and 2020, and the good news is that storage costs have been moving on a curve steeper than Moore's Law. So in the end, these numbers may not be so eye-poppingly large.

But it's clear the global engineering challenge (and opportunity) related to cloud computing, Big Data, and the IoT is enormous. Let's forget for a few seconds what revenue might be generated for software and services companies. Let's forget what value might be added to national economies by new business and new productivity levels.

The US Interstate Highway system was built for $400 billion in current dollars, give or take. The global Information Superhighway (yes, let's bring back that term!) is several times larger, Moore's Law notwithstanding.

But can it happen? Do we have the societal will to build this 21st century hive intelligence?

This is where our friends the politicians must eradicate their collective Anaproctocephalogical Syndrome and do some good for humanity.

The US in particular could be - could be - a leader in open, global communications by ending its "possess the haystack to find the needle" approach to spying on everybody and their brother and your Aunt Maude. Recent remarks by President Obama give me little hope at present.

Because the CC/BD/IoT challenge is as much a socio-political challenge as it is an engineering and economic challenge.

Optimism, Pessimism, or Reality?
The numbers I played with here serve as a general indicator of what it is we have, unwittingly or not, embarked upon with our wondrous machines. The real numbers will play out over time. In any case, we are on the cusp of transformational change.

IBM's SVP of Global Technology Services Erich Clementi, writing in his blog about the company's new datacenter initiative, touts IBM's commitment to "robust global networks of datacenters."

Clementi also enthuses, "cloud computing is a fabric that will knit the entire world closer together: businesses, economies and people. A lot of good will come of it. But, first, we have to build a robust global network of cloud data centers to turn that promise into reality."

Yes, if all this data can continue to flow across borders relatively easily and peacefully (as email and website information have for some time now), there is hope for all nations of the world to improve themselves through the transformational change wrought by mobility, sensors, and the ongoing social-media revolution.

If not, if instead national firewalls become common to keep the US government out, and we end up living on a globe of re-isolated nations, then all these numbers mean less than zero. No zettabytes for you.


More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
