
Shifting the Storage Paradigm | Part One: The Evolution of Data

Why object storage is the prevalent platform for scale-out storage infrastructures

The storage industry is going through a big paradigm shift caused by drastic changes in how we generate and consume data. As a result, we also have to drastically change how we store data: the market needs massive, online storage pools that can be accessed from anywhere, at any time. Object storage has emerged as a solution to these changing needs, and it is currently a hot topic because it creates opportunities for new revenue streams.

In this three-part blog series, I will explore how storage has changed – creating a need for new methodologies – and why object storage is the prevalent platform for scale-out storage infrastructures.

To understand how storage has changed, let’s take a look at how data has evolved over the past three decades, paying special attention to data generation and consumption.

Transactional Data
In the 1980s and 1990s, the most valuable digital data was transactional data – database records, created and accessed through database applications. This led to the success of large database and database application companies. Transactional data continues to be important today, but nothing on the horizon suggests that database solutions will be unable to manage the – relatively slow – growth of structured information. From a storage point of view, the structured data challenge is handled well by block-based (SAN) storage platforms, designed to deliver the high IOPS needed to run large enterprise databases.
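To make that access pattern concrete, here is a minimal sketch using Python's built-in sqlite3 module and a made-up orders table – purely illustrative, not any particular enterprise stack – showing the small, record-oriented, transactional operations that block storage is tuned to serve:

```python
# Illustrative only: a tiny, made-up "orders" table showing what transactional,
# record-oriented access looks like -- many small reads/writes, each ACID-protected.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

with conn:  # the whole block commits as one transaction, or rolls back on error
    conn.execute("INSERT INTO orders (customer, amount) VALUES (?, ?)", ("acme", 199.99))
    conn.execute("UPDATE orders SET amount = amount * 1.1 WHERE customer = ?", ("acme",))

print(conn.execute("SELECT * FROM orders").fetchall())
```

Each of these operations touches a handful of small blocks, which is why high-IOPS block (SAN) storage fits this workload so well.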

Unstructured Data
With the advent of the office suite, unstructured data became much more important than it had ever been before. By the mid-1990s, every office worker had a desktop computer with an office suite. E-mail allowed us to send those files around, and storage consumption went through the roof. Enterprises were soon challenged to build shared file storage infrastructures, and backup and archiving brought challenges of their own. Tiered storage was born. Storage was both hot and cool. Over the next two decades we would see plenty of innovations to manage fast-growing unstructured data sets – the file storage (NAS) industry skyrocketed.

But people can only generate so many office documents. The average PowerPoint file is probably three times as big today as it was back in 1999, but that is not even close to the data growth predictions we continue to hear (doubling every year). Just as SANs have evolved sufficiently to cope with changing database requirements, NAS platforms would have been able to cope with the growth of unstructured data if it hadn't been for the sensor-induced Big Data evolution of the past decade.
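A quick back-of-the-envelope calculation – with purely illustrative numbers – shows how far apart those two growth curves really are:

```python
# Illustrative numbers only: office-document growth vs. "doubling every year" predictions.
years = 2014 - 1999
office_growth = 3              # a deck roughly 3x bigger than its 1999 counterpart
predicted_growth = 2 ** years  # doubling every year compounds to 2^15

print(f"Office documents over {years} years: {office_growth}x")
print(f"Doubling every year over {years} years: {predicted_growth:,}x")  # 32,768x
```

Office documents alone clearly cannot account for that curve; something else must be generating the data.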

Big Data
The first mentions of Big Data referred to what we now understand as Big Data Analytics: scientists, mostly, were challenged to store research data from innovative information-sensing devices, captured for analytics purposes. Traditional databases would not scale sufficiently for this data, so alternative methods were needed. This led to innovations like Hadoop/MapReduce, which we also like to refer to as Big “semi-structured” Data: the data is not structured as in a database, but it is not really unstructured either.
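For readers who haven't worked with it, here is a toy sketch of the MapReduce idea in plain Python – not Hadoop itself, just the map and reduce steps applied to a few made-up sensor log lines – showing how semi-structured data can be processed without a database schema:

```python
# Toy MapReduce: made-up, semi-structured sensor log lines are mapped to
# key/value pairs, then reduced (aggregated) per key -- no database schema needed.
from collections import defaultdict

log_lines = [
    "2014-01-01 sensor-7 temp=21.3",
    "2014-01-01 sensor-7 temp=21.9",
    "2014-01-01 sensor-3 temp=19.8",
]

def map_phase(line):
    # the "map" step: emit (sensor_id, 1) for every reading
    _, sensor, _ = line.split()
    yield sensor, 1

def reduce_phase(pairs):
    # the "reduce" step: sum the emitted counts per sensor
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

pairs = [pair for line in log_lines for pair in map_phase(line)]
print(reduce_phase(pairs))  # {'sensor-7': 2, 'sensor-3': 1}
```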

Bigger Data
Information-sensing devices are not exclusive to scientific analytics environments, however. Smartphones, tablets, photo cameras and scanners – just to name a few – are all information-sensing devices, and they create the vast majority of all unstructured information generated today. In the past decade we have seen not only a massive increase in the popularity of these devices, but also continuous quality improvements. This led to more, and bigger, data. The result is a true data explosion of mostly immutable data: unlike office documents, most sensor data is never changed once it is written.
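To see why immutability matters for scale, consider this hypothetical sketch – not any particular product's interface – of a write-once store: objects are addressed by their content and never updated in place, so there are no conflicting updates to reconcile across nodes:

```python
# Hypothetical write-once store: each object gets a content-derived key and is
# never modified after the initial put, so replicas never need to reconcile updates.
import hashlib

class WriteOnceStore:
    def __init__(self):
        self._objects = {}  # key -> immutable bytes

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()  # content-addressed key
        self._objects.setdefault(key, data)     # putting the same data twice is a no-op
        return key

    def get(self, key: str) -> bytes:
        return self._objects[key]

store = WriteOnceStore()
key = store.put(b"frame-000001.jpg bytes from a camera sensor")
print(key[:12], len(store.get(key)))
```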

This immutable nature of unstructured data holds the key to solving the scalability problem of traditional file storage. Tune in to my next posts, where I will dive into how to leverage this aspect of enterprise data to build an object storage solution for the shifting storage paradigm.

More Stories By Tom Leyden

Tom Leyden is VP Product Marketing at Scality. Scality was founded in 2009 by a team of entrepreneurs and technologists. The idea wasn’t storage, per se. When the Scality team talked to the initial base of potential customers, the customers wanted a system that could “route” data to and from individual users in the most scalable, efficient way possible. And so began a non-traditional approach to building a storage system that no one had imagined before. No one thought an object store could have enough performance for all the files and attachments of millions of users. No one thought a system could remain up and running through software upgrades, hardware failures, capacity expansions, and even multiple hardware generations coexisting. And no one believed you could do all this and scale to petabytes of content and billions of objects in pure software.
