
The IoT: Zettabytes Approaching

IBM's Initiative to Build 15 New Datacenters Brings Things Into Focus

"None of us really understands what's going on with all these numbers." Thus said David Stockman, the then-wunderkind budget director for newly elected President Ronald Reagan in 1981.

Stockman was widely ridiculed for such a rare burst of candor from a government official. He was referring to the administration's efforts to grapple with the major budget and tax reforms candidate Reagan had promised the year before.

I think it's fair enough to use these words as a basis for what's going on in the commingling worlds of Cloud Computing, Big Data, and the Internet of Things (IoT).

I've promised to write about all that's happening with IoT between now and @ThingsExpo June 10-12 in New York, an event for which I serve as Conference Chair.

Zettabytes Take the Stage
But first I have to get a grip on what's going on with all these numbers.

Let's start with a prediction by CSC, the Washington, DC-area IT services provider. I'm reading one of its infographics that alleges Big Data will cause global data storage needs to increase 44 times by 2020, reaching 35 zettabytes. (It says we had 0.79 zettabyte under control in 2009.)

"Only" 10.5 zettabytes of the 2020 total will be generated by enterprises, according to CSC. But thanks to the cloud, 28 zettabytes will be managed by enterprises.
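The arithmetic behind CSC's projection is easy to check; a quick sketch in Python (the 0.79-zettabyte baseline and 44x multiple come straight from the infographic):

```python
# CSC's projection: 0.79 ZB under management in 2009, growing 44x by 2020
zb_2009 = 0.79
growth = 44

zb_2020 = zb_2009 * growth
print(f"Projected 2020 total: {zb_2020:.1f} ZB")   # 34.8 ZB - the "35 zettabytes"
```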

Break it Down
Let's break this down by imagining a zettabyte. I, for one, am still not comfortable visualizing, abstracting, or using that term. A zettabyte is 1 billion terabytes, 1 million petabytes, or 1,000 exabytes.

Yes, so take today's typical 1 terabyte personal-computing hard drive (worth about $80) and multiply that by a billion to get a single zettabyte. Now imagine storing 28 of them.
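The unit ladder and the hard-drive analogy can be sanity-checked in a few lines of Python (decimal SI prefixes assumed, as is conventional for storage figures):

```python
# Storage unit ladder, in bytes (decimal SI prefixes)
TB = 10**12   # terabyte
PB = 10**15   # petabyte
EB = 10**18   # exabyte
ZB = 10**21   # zettabyte

assert ZB == 1_000_000_000 * TB   # 1 billion terabytes
assert ZB == 1_000_000 * PB       # 1 million petabytes
assert ZB == 1_000 * EB           # 1,000 exabytes

# 28 zettabytes expressed as today's 1-terabyte drives
drives = 28 * ZB // TB
print(f"{drives:,} one-terabyte drives")   # 28,000,000,000 - i.e., 28 billion
```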

The bandwidth requirements for this amount of data will be similarly daunting. If only 1% of that data were zipping around per second, we'd need more than 2 trillion gigabit connections to make it happen.

(1% of 28 zettabytes = 280 exabytes = 280 million terabytes = 280 billion gigabytes = 2.24 trillion gigabits.)
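The parenthetical arithmetic works out as follows (the 1%-per-second figure is, of course, an arbitrary thought-experiment assumption):

```python
ZB = 10**21                       # zettabyte, in bytes

stored = 28 * ZB                  # enterprise-managed data in 2020, per CSC
in_flight = 0.01 * stored         # 1% of it moving each second (thought experiment)
bits_per_second = in_flight * 8   # bytes -> bits
gigabit_links = bits_per_second / 10**9

print(f"{gigabit_links / 1e12:.2f} trillion gigabit connections")   # 2.24 trillion
```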

We're going to need a bigger boat.

Many Big Datacenters
When we apply the 28-zettabyte figure to datacenters, the results are equally shocking - and especially relevant in the wake of the recent news that IBM plans to build 15 new datacenters at a cost of $1.2 billion.

That's $80 million per datacenter, a modest number in the datacenter world, and one which will result in an average facility encompassing about 8,000 computers, 80,000 square feet, and perhaps 0.8 exabyte of storage.

To reach 28 zettabytes, we would need only 35,000 of these datacenters in the world. Using IBM's budget for its new datacenter initiative, total cost would come in at 35,000 x $80 million, or $2.8 trillion. If, say, one quarter of them were built in the US, we'd see one every 15 miles or so driving down any road.

Oh, now we have to add about 84,000 megawatts to the electrical grid, which shouldn't require more than around 50 large power plants, whether nuclear or natural-gas. There's also the matter of water usage for cooling, to be measured in the billions of gallons per day.
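Pulling the datacenter arithmetic together in one place (the 0.8-exabyte-per-facility figure is the back-of-the-envelope assumption stated earlier, and the 2.4-megawatt-per-facility figure is simply what the 84,000 MW total implies):

```python
EB = 10**18
ZB = 10**21

target = 28 * ZB                 # storage to be managed by enterprises in 2020
storage_per_dc = 0.8 * EB        # assumed per $80M facility
cost_per_dc = 80e6               # IBM's $1.2B spread over 15 datacenters
power_per_dc_mw = 2.4            # assumed; consistent with the 84,000 MW total

n = target / storage_per_dc
print(f"{n:,.0f} datacenters")                           # 35,000
print(f"${n * cost_per_dc / 1e12:.1f} trillion")         # $2.8 trillion
print(f"{n * power_per_dc_mw:,.0f} MW of new capacity")  # 84,000 MW
```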

Can It Happen?
Moore's Law can be expected to work its magic between now and 2020, and the good news is that storage costs have been moving on a curve steeper than Moore's Law. So in the end, these numbers may not be so eye-poppingly large.

But it's clear that the global engineering challenge related to cloud computing, Big Data, and the IoT is also an enormous opportunity. Let's forget for a few seconds what revenue might be generated for software and services companies. Let's forget what value might be added to national economies by new business and new productivity levels.

The US Interstate Highway system was built for $400 billion in current dollars, give or take. The global Information Superhighway (yes, let's bring back that term!) is several times larger, Moore's Law notwithstanding.

But can it happen? Do we have the societal will to build this 21st century hive intelligence?

This is where our friends the politicians must eradicate their collective Anaproctocephalogical Syndrome and do some good for humanity.

The US in particular could be - could be - a leader in open, global communications by ending its "possess the haystack to find the needle" approach to spying on everybody and their brother and your Aunt Maude. Recent remarks by President Obama give me little present hope.

Because the CC/BD/IoT challenge is as much a socio-political challenge as it is an engineering and economic challenge.

Optimism, Pessimism, or Reality?
The numbers I played with here serve as a general indicator of what it is we have, unwittingly or not, set upon with our wondrous machines. The real numbers will play out over time. In any case, we are on the cusp of transformational change.

IBM's SVP of Global Technology Services Erich Clementi, writing in his blog about the company's new datacenter initiative, touts IBM's commitment to "robust global networks of datacenters."

Clementi also enthuses, "cloud computing is a fabric that will knit the entire world closer together - businesses, economies and people. A lot of good will come of it. But, first, we have to build a robust global network of cloud data centers to turn that promise into reality."

Yes, if all this data can continue to flow across borders relatively easily and peacefully (as email and website information have for some time now), there is hope for all nations of the world to improve themselves through the transformational change wrought by mobility, sensors, and the ongoing social-media revolution.

If not, if instead national firewalls become common to keep the US government out, and we end up living on a globe of re-isolated nations, then all these numbers mean less than zero. No zettabytes for you.

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
