Shifting the Storage Paradigm | Part One: The Evolution of Data

Why object storage is the prevalent platform for scale-out storage infrastructures

The storage industry is going through a big paradigm shift caused by drastic changes in how we generate and consume data. As a result, we also have to drastically change how we store data: the market needs massive, online storage pools that can be accessed from anywhere and anytime. Object Storage has emerged as a solution to meet the changing needs of the market and it is currently a hot topic as it creates opportunities for new revenue streams.

In this three-part blog series, I will explore how storage has changed – creating a need for new methodologies – and why object storage is the prevalent platform for scale-out storage infrastructures.

To understand how storage has changed, let’s take a look at how data has evolved over the past three decades, paying special attention to data generation and consumption.

Transactional Data
In the 1980s and 1990s, the most valuable digital data was transactional data – database records created and accessed through database applications. This drove the success of the large database and database-application companies. Transactional data remains important today, and there is no sign that database solutions will struggle to manage the relatively slow growth of structured information. From a storage point of view, the structured data challenge is handled well by block-based (SAN) storage platforms, which are designed to deliver the high IOPS needed to run large enterprise databases.

Unstructured Data
With the advent of the office suite, unstructured data became far more important than it had ever been before. By the mid-1990s, every office worker had a desktop computer with an office suite. E-mail allowed us to send those files around, and storage consumption went through the roof. Enterprises were soon challenged to build shared file storage infrastructures, and backup and archiving became challenges in their own right. Tiered storage was born: storage was both hot and cool. Over the next two decades we saw plenty of innovation in managing fast-growing unstructured data sets, and the file storage (NAS) industry skyrocketed.

But people can only generate so many office documents. The average PowerPoint file is probably three times as big today as it was back in 1999, but that is not even close to the data growth predictions we continue to hear (doubling every year). Just as SANs have evolved to cope with changing database requirements, NAS platforms would have been able to cope with the growth of unstructured data if it weren't for the sensor-induced Big Data evolution of the past decade.

Big Data
The first mentions of Big Data referred to what we now understand as Big Data Analytics: scientists, mostly, were challenged to store research data from innovative information-sensing devices, captured for analytics purposes. Traditional databases would not scale sufficiently for this data, so alternative methods were needed. This led to innovations like Hadoop/MapReduce, which we also like to refer to as Big "semi-structured" Data: the data is not structured as it would be in a database, but it is not really unstructured either.
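To make the "semi-structured" point concrete, here is a minimal, hypothetical sketch of the map/shuffle/reduce pattern that Hadoop popularized, written in plain Python rather than on Hadoop itself. The sensor records and field names are invented for illustration: each line has a loose key=value layout rather than a fixed database schema, yet it still carries enough structure to aggregate.

```python
from collections import defaultdict

# Hypothetical semi-structured sensor log lines: no fixed schema,
# but each record still exposes key=value pairs we can parse.
records = [
    "ts=1398816000 sensor=A7 temp=21.4",
    "ts=1398816060 sensor=A7 temp=21.9",
    "ts=1398816000 sensor=B3 temp=19.2",
]

def map_phase(line):
    """Emit (sensor_id, temperature) pairs from one raw record."""
    fields = dict(kv.split("=") for kv in line.split())
    yield fields["sensor"], float(fields["temp"])

def reduce_phase(key, values):
    """Aggregate all temperatures observed for one sensor."""
    return key, sum(values) / len(values)

# Shuffle: group mapped values by key, as a MapReduce framework would.
groups = defaultdict(list)
for line in records:
    for key, value in map_phase(line):
        groups[key].append(value)

for key, values in groups.items():
    print(reduce_phase(key, values))  # e.g. ('A7', 21.65)
```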

Bigger Data
Information-sensing devices are not exclusive to scientific analytics environments, however. Smartphones, tablets, photo cameras and scanners – to name just a few – are all information-sensing devices, and they create the vast majority of the unstructured information generated today. In the past decade we have seen not only a massive increase in the popularity of these devices but also continuous improvements in their quality. This has led to more, and bigger, data. The result is a true data explosion of mostly immutable data: unlike office documents, most sensor data is never changed after it is written.

This immutable nature of unstructured data holds the key to solving the scalability problem of traditional file storage. Tune in to my next posts, where I will dive into how to leverage this aspect of enterprise data to build an object storage solution for the shifting storage paradigm.
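As a hint of why immutability matters, here is a minimal, hypothetical sketch (not Scality's design, and not any particular product's API) of a write-once object store in Python. Because objects are never modified in place, each one can be addressed by a content hash, which makes replication, caching and integrity checking straightforward and removes the locking and coherence machinery a shared file system needs for in-place updates.

```python
import hashlib

class WriteOnceObjectStore:
    """Toy content-addressed store: objects are immutable once written."""

    def __init__(self):
        self._objects = {}  # key (content hash) -> bytes

    def put(self, data: bytes) -> str:
        """Store an object and return its content-derived key."""
        key = hashlib.sha256(data).hexdigest()
        # Writing the same content twice is a no-op: immutability means
        # there is never a conflicting update to reconcile.
        self._objects.setdefault(key, data)
        return key

    def get(self, key: str) -> bytes:
        data = self._objects[key]
        # The key doubles as an integrity check on every read.
        assert hashlib.sha256(data).hexdigest() == key
        return data

store = WriteOnceObjectStore()
key = store.put(b"frame 0001 from a photo camera")
print(key[:12], store.get(key))
```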

More Stories By Tom Leyden

Tom Leyden is VP Product Marketing at Scality. Scality was founded in 2009 by a team of entrepreneurs and technologists. The idea wasn’t storage, per se. When the Scality team talked to the initial base of potential customers, the customers wanted a system that could “route” data to and from individual users in the most scalable, efficient way possible. And so began a non-traditional approach to building a storage system that no one had imagined before. No one thought an object store could have enough performance for all the files and attachments of millions of users. No one thought a system could remain up and running through software upgrades, hardware failures, capacity expansions, and even multiple hardware generations coexisting. And no one believed you could do all this and scale to petabytes of content and billions of objects in pure software.
