The IoT: Zettabytes Approaching

IBM's Initiative to Build 15 New Datacenters Brings Things Into Focus

"None of us really understands what's going on with all these numbers." Thus said David Stockman, the then-wunderkind budget director for newly elected President Ronald Reagan in 1981.

Stockman was widely ridiculed for such a rare burst of candor from a government official. He was referring to the administration's efforts to grapple with the major budget and tax reforms candidate Reagan had promised the year before.

I think it's fair to apply these words to what's going on in the commingling worlds of Cloud Computing, Big Data, and the Internet of Things (IoT).

I've promised to write about all that's happening with IoT between now and @ThingsExpo June 10-12 in New York, an event for which I serve as Conference Chair.

Zettabytes Take the Stage
But first I have to get a grip on what's going on with all these numbers.

Let's start with a prediction by CSC, the Washington, DC-area IT services provider. I'm reading one of its infographics that alleges Big Data will cause global data storage needs to increase 44-fold by 2020, reaching 35 zettabytes. (It says we had 0.79 zettabyte under control in 2009.)

"Only" 10.5 zettabytes of the 2020 total will be generated by enterprises, according to CSC. But thanks to the cloud, 28 zettabytes will be managed by enterprises.

Break it Down
Let's break this down by imagining a zettabyte. I, for one, am still not comfortable visualizing, abstracting, or using that term. A zettabyte is 1 billion terabytes, 1 million petabytes, or 1,000 exabytes.

Yes, so take today's typical 1 terabyte personal-computing hard drive (worth about $80) and multiply that by a billion to get a single zettabyte. Now imagine storing 28 of them.

The bandwidth requirements for this amount of data will be similarly daunting. If only 1% of that data were zipping around per second, we'd need more than 2 trillion gigabit connections to make it happen.

(1% of 28 zettabytes = 280 exabytes = 280 million terabytes = 280 billion gigabytes ≈ 2.2 trillion gigabits.)
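For the skeptical, the conversions check out in a few lines of Python. This is just a sanity check of the arithmetic above, using decimal (SI) prefixes, so 1 zettabyte = 10^21 bytes:

```python
# Decimal (SI) unit sizes.
GB = 10**9            # bytes in a gigabyte
EB = 10**18           # bytes in an exabyte
ZB = 10**21           # bytes in a zettabyte
GIGABIT = 10**9       # bits in a gigabit

total = 28 * ZB            # data managed by enterprises in 2020
in_flight = total // 100   # 1% of it in motion per second

print(in_flight // EB)           # 280 (exabytes)
print(in_flight // GB)           # 280,000,000,000 (gigabytes)
print(in_flight * 8 // GIGABIT)  # 2,240,000,000,000 -> ~2.2 trillion gigabits
```

Integer arithmetic keeps the conversions exact; the punchline is that last number, 2.24 trillion gigabit links' worth of data moving every second.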

We're going to need a bigger boat.

Many Big Datacenters
When we apply the 28-zettabyte figure to datacenters, the numbers are equally shocking. The exercise is timely in the wake of the recent news that IBM plans to build 15 new datacenters at a cost of $1.2 billion.

That's $80 million per datacenter, a modest sum in the datacenter world, and one that will result in an average facility encompassing about 8,000 computers, 80,000 square feet, and perhaps 0.8 exabyte of storage.

To reach 28 zettabytes, we would need only 35,000 of these datacenters in the world. Using IBM's budget for its new datacenter initiative, total cost would come in at 35,000 x $80 million, or $2.8 trillion. If, say, one quarter of them were built in the US, we'd see one every 15 miles or so driving down any road.

Oh, now we have to add about 84,000 megawatts to the electrical grid, which shouldn't require more than around 50 large power plants, whether nuclear or natural-gas. There's also the matter of water usage for cooling, to be measured in the billions of gallons per day.
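The fleet math above can be sketched the same way. Note that the 0.8-exabyte capacity and $80 million cost per facility are the rough averages assumed earlier, extrapolated from IBM's announcement, not IBM specifications:

```python
EB = 10**18
ZB = 10**21

target = 28 * ZB             # global enterprise-managed storage by 2020
dc_storage = 8 * EB // 10    # assumed 0.8 exabyte per datacenter
dc_cost = 80_000_000         # assumed $80M per datacenter

num_dcs = target // dc_storage    # 35,000 datacenters worldwide
total_cost = num_dcs * dc_cost    # $2.8 trillion
power_per_dc = 84_000 / num_dcs   # 2.4 MW per facility
plant_size = 84_000 / 50          # 1,680 MW per power plant

print(num_dcs, total_cost, power_per_dc, plant_size)
```

Dividing the 84,000 new megawatts across 50 plants implies roughly 1,680 MW each, which is indeed nuclear- or large-gas-plant territory.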

Can It Happen?
Moore's Law can be expected to work its magic between now and 2020, and the good news is that storage costs have been falling on a curve even steeper than Moore's Law. So in the end, these numbers may not be so eye-poppingly large.

But it's clear that the global engineering challenge (and opportunity) related to cloud computing, Big Data, and the IoT is enormous. Let's forget for a few seconds what revenue might be generated for software and services companies. Let's forget what value might be added to national economies by new business and new productivity levels.

The US Interstate Highway system was built for $400 billion in current dollars, give or take. The global Information Superhighway (yes, let's bring back that term!) is several times larger, Moore's Law notwithstanding.

But can it happen? Do we have the societal will to build this 21st century hive intelligence?

This is where our friends the politicians must eradicate their collective Anaproctocephalogical Syndrome and do some good for humanity.

The US in particular could be - could be - a leader in open, global communications by ending its "possess the haystack to find the needle" approach to spying on everybody and their brother and your Aunt Maude. Recent remarks by President Obama give me little hope at present.

Because the CC/BD/IoT challenge is as much a socio-political challenge as it is an engineering and economic challenge.

Optimism, Pessimism, or Reality?
The numbers I played with here serve as a general indicator of what it is we have, unwittingly or not, set upon with our wondrous machines. The real numbers will play out over time. In any case, we are on the cusp of transformational change.

IBM's SVP of Global Technology Services Erich Clementi, writing in his blog about the company's new datacenter initiative, touts IBM's commitment to "robust global networks of datacenters."

Clementi also enthuses, "cloud computing is a fabric that will knit the entire world closer together: businesses, economies and people. A lot of good will come of it. But, first, we have to build a robust global network of cloud data centers to turn that promise into reality."

Yes, if all this data can continue to flow across borders relatively easily and peacefully (as email and website information have for some time now), there is hope for all nations of the world to improve themselves through the transformational change wrought by mobility, sensors, and the ongoing social-media revolution.

If not, if instead national firewalls become common to keep the US government out, and we end up living on a globe of re-isolated nations, then all these numbers mean less than zero. No zettabytes for you.

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
