
The Future Is Now: Why Flash Storage Will Transform the Data Center

By using software-defined storage, data center architects can design a flexible, efficient and powerful framework

For an example of just how dramatically storage has changed over the past fifteen years, consider your music collection. At one point, you had a collection of cassettes that stored songs on magnetic tape. As the years went on, your hairstyle changed and you bought a CD player that used a spinning disk to store more song data at higher quality than tape could. Spinning disks flourished well into the MP3 player era, surviving even the initial introduction of flash storage because disk remained far cheaper. Eventually, however, your newest smartphone or iPod shipped with flash storage instead, as manufacturers bowed to its superior performance and its increasingly competitive price.

The same sea change is taking place at a much bigger scale as well: instead of gigabytes, think petabytes.

The data center infrastructures that telcos, service providers and major enterprises design to store massive quantities of data have until recently relied predominantly on disk storage in their servers, sometimes blending in flash for performance-intensive tasks. The speed and performance of flash have long tempted data center architects to deploy it more widely throughout the data center, but only recently has its price dropped enough to make broader use a viable option.

To understand why flash storage has suddenly become a practical choice for data center architects across industries, it is helpful to examine the differences between flash and disk storage.

The Next Big Thing, Again
As the example above shows, disk storage, when it was introduced, represented leaps and bounds of progress in speed and efficiency over tape storage, the predominant method of the time. Even after flash came to market, disk remained the server storage of choice. Flash did deliver substantially higher performance, but it was priced too high to present a real threat to the prevalence of spinning disks. In addition, flash drives were smaller in capacity and could not store as much data per dollar as spinning disks.

However, new improvements in flash have slashed its price significantly, positioning it as a true data center hardware alternative, even as its benefits - speed in throughput and low latency - have dramatically increased. As an added plus, flash is highly energy efficient, sometimes needing as little as one-sixteenth the power that disk storage requires. Flash drives still wear out faster than spinning disks, but flash's performance gains and falling price in recent years have made it a realistic and highly attractive option for data center architecture and design.
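To make that energy claim concrete, here is a back-of-envelope calculation in Python. The wattage, fleet size and electricity price are illustrative assumptions, not measurements; only the 1:16 ratio comes from the text above.

```python
# Back-of-envelope comparison: assume a spinning disk draws 8 W on
# average and flash draws one-sixteenth of that, per the ratio above.
# All figures here are illustrative assumptions, not measurements.
HDD_WATTS = 8.0
SSD_WATTS = HDD_WATTS / 16
DRIVE_COUNT = 1_000
HOURS_PER_YEAR = 24 * 365
USD_PER_KWH = 0.10            # assumed electricity price

def annual_energy_cost(watts_per_drive: float) -> float:
    """Yearly electricity cost for the whole drive fleet, in USD."""
    kwh = watts_per_drive * DRIVE_COUNT * HOURS_PER_YEAR / 1_000
    return kwh * USD_PER_KWH

print(f"Disk:  ${annual_energy_cost(HDD_WATTS):,.0f} per year")
print(f"Flash: ${annual_energy_cost(SSD_WATTS):,.0f} per year")
# Cooling savings would widen the gap further, since less heat is emitted.
```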

Making the Switch
In fact, it's increasingly feasible that today's data center - still reliant on disk storage - could use 100 percent flash storage tomorrow. Telcos, service providers and other major enterprises whose profits are tied to the speed and availability they can offer their customers are beginning to see flash storage's blistering performance less as a "nice to have" option and more as a core technology for maintaining a competitive edge.

While the high-performance-demanding telco and service provider industries are diving into flash straight away, vendors in other vertical markets have run the cost-benefit calculations and elected to hold back until the price of flash drops even further. A Dropbox-style file hosting service for consumer cloud storage, for example, is motivated less by raw performance than by the availability of cheap storage at scale. Companies like these are making the usual storage tradeoff, finding a comfortable spot between price and capacity. When the price of flash finally falls to that of disk storage, however, the last barrier will be removed for those companies that want to remain competitive. Once that milestone is reached, the market shift will be as significant as when disks replaced tape, winning on the same measures: higher performance and better pricing.

Advancements in Software
One of the trends making this shift possible is software-defined storage. By adopting a software-defined approach to storage infrastructure, organizations gain the flexibility to deploy flash storage throughout their data center architectures quickly and easily.

As background, software-defined storage seeks to move functions and features from the hardware layer to the software layer, removing the dependence on expensive hardware redundancy to work around hardware-level failures. Data center architects must always plan for the inevitable failure of hardware, and flash in particular currently fails sooner than disk does. In storage environments without RAID cards, a failed disk produces errors that reach the end user, so architects traditionally build in expensive, redundant RAID cards to hide those errors. With the right software-defined strategy, the same failures can instead be absorbed in software and made invisible to the end user. And because software-defined storage is hardware-agnostic, it can run on any hardware configuration.
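As a minimal sketch of how redundancy can live in software rather than in a RAID card, consider the Python toy below. The replica count, node layout and in-memory dicts are illustrative assumptions for the demo only; a production system would persist data, detect failures and re-replicate, none of which is shown here.

```python
import hashlib

class ReplicatedStore:
    """Toy software-defined store: each object is written to several
    nodes, so a failed node is masked by reading a surviving replica."""

    def __init__(self, nodes, replicas=3):
        self.nodes = nodes            # each "node" is a plain dict here
        self.replicas = replicas

    def _targets(self, key):
        # Deterministically choose which nodes hold this key's replicas.
        start = int(hashlib.md5(key.encode()).hexdigest(), 16) % len(self.nodes)
        return [self.nodes[(start + i) % len(self.nodes)]
                for i in range(self.replicas)]

    def put(self, key, value):
        for node in self._targets(key):
            node[key] = value

    def get(self, key):
        for node in self._targets(key):
            if key in node:           # skip nodes that failed or lost data
                return node[key]
        raise KeyError(key)

# Simulate a node failure: the read transparently falls back to a replica,
# with no RAID hardware involved.
nodes = [{} for _ in range(5)]
store = ReplicatedStore(nodes)
store.put("video.mp4", b"bytes")
store._targets("video.mp4")[0].clear()   # the primary node dies
assert store.get("video.mp4") == b"bytes"
```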

There are a number of additional benefits that telco and service provider data center architects can achieve by combining software-defined storage with flash hardware. For instance, a software-defined approach lets the organization keep a single namespace spanning all of its storage nodes. It can also run applications on the storage nodes themselves, creating combined "compustorage" nodes. As a result, the storage hardware doesn't need to be big or costly, yet can still deliver very high performance and speed. Organizations can start with a small number of cheap servers instead of building a large, expensive, traditional installation, and still scale linearly as needed.
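A sketch of that single-namespace idea, again in Python and again purely illustrative: a consistent-hash ring lets clients address objects by name alone while nodes are added incrementally. The node names, virtual-node count and object count are assumptions made for the demo, not anything prescribed by the article.

```python
import bisect
import hashlib

def _h(s: str) -> int:
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

class SingleNamespace:
    """One flat namespace over many nodes via a consistent-hash ring:
    clients address objects by name alone, and growing the cluster
    remaps only a small share of the keyspace."""

    def __init__(self, nodes, vnodes=100):
        # Each node appears at many points on the ring for even spread.
        self.ring = sorted((_h(f"{n}:{v}"), n)
                           for n in nodes for v in range(vnodes))
        self._hashes = [h for h, _ in self.ring]

    def node_for(self, name: str) -> str:
        i = bisect.bisect(self._hashes, _h(name)) % len(self.ring)
        return self.ring[i][1]

# Adding a fourth node moves only about a quarter of the objects,
# so capacity grows incrementally rather than via a forklift upgrade.
ns3 = SingleNamespace(["node1", "node2", "node3"])
ns4 = SingleNamespace(["node1", "node2", "node3", "node4"])
names = [f"object-{i}" for i in range(10_000)]
moved = sum(ns3.node_for(n) != ns4.node_for(n) for n in names)
print(f"{moved / len(names):.0%} of objects move when node4 joins")
```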

Flash Assets
Benefits of a software-defined approach to an all-flash data center include:

  • Huge performance improvements, since faster flash technology can be used throughout the data center.
  • Lower power consumption: SSDs generate far less heat than spinning disks and require less energy for cooling, reducing running costs.
  • A smaller data center footprint: SSDs are much smaller than spinning disks and need less space and real estate to house them.
  • More applications running on the same hardware, thanks to the performance gains.

Conclusion
Even as many of us still listen to CDs in the car, the music industry is inevitably shifting to a new paradigm built on music files saved to flash storage. The trend is repeating across industries, but nowhere as dramatically as in the data center. Flash storage - with its extreme performance, efficient energy usage and increasingly competitive cost - will eventually become the industry status quo. By using software-defined storage, data center architects can design the flexible, efficient and powerful framework that telcos, service providers and major enterprises need to build the most powerful and energy-efficient all-flash data center possible.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions aimed at storing huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, a wide-reaching Internet-based storage service for consumer and business markets with the highest availability and scalability requirements. Before that, Stefan worked on system and software architecture for several projects at Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.


