
The Future Is Now: Why Flash Storage Will Transform the Data Center

By using software-defined storage, data center architects can design a flexible, efficient and powerful framework

For an example of just how dramatically storage has changed over the past fifteen years, consider your music collection. At one point, you had a collection of cassettes that stored songs on magnetic tape. As the years went on, your hairstyle changed and you bought a CD player that used a spinning disk to store more song data at higher quality than tape could. Spinning disks flourished well into the MP3 player era, surviving even the initial introduction of flash storage thanks to disk's lower cost. Eventually, however, your newest smartphone or iPod shipped with flash storage instead, as manufacturers bowed to its superior performance over disk storage and its increasingly competitive price.

The same sea change is now taking place at a much bigger scale. Instead of gigabytes, think petabytes.

The data center infrastructures designed by telcos, service providers and major enterprises to store massive quantities of data have long relied predominantly on disk storage in their servers, sometimes blending in flash for performance-intensive tasks. While the speed and performance of flash storage have tempted data center architects to deploy it more widely throughout the data center, only recently has the price of flash dropped enough to make its broader use a viable option.

To understand why flash storage has suddenly become a practical choice for data center architects across industries, it is helpful to examine the differences between flash and disk storage.

The Next Big Thing, Again
As the example above shows, disk storage at its introduction represented a great leap forward in speed and efficiency over tape storage, the predominant medium of the time. Even after flash was introduced to the market, disk storage remained the server architecture of choice. Flash delivered substantially higher performance, but was priced too high to pose a real threat to the prevalence of spinning disks. In addition, flash drives were smaller in capacity and could not store as much data per dollar as spinning disks.

However, new improvements in flash have slashed its price significantly, positioning it as a true data center hardware alternative, while its benefits - higher throughput and lower latency - have grown dramatically at the same time. As an added plus, flash is highly energy efficient, needing only a fraction of the power required by disk storage, sometimes as little as one-sixteenth. Flash drives still wear out faster than disk drives do, but flash's gains in performance and drop in price in recent years have made it a realistic and highly attractive option for data center architecture and design needs.
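
To put that power ratio in perspective, here is a minimal back-of-the-envelope sketch. Only the 1:16 ratio comes from the article; the per-drive wattage, drive count and electricity price below are illustrative assumptions, not figures from the source.

    # Rough annual energy-cost comparison for a fleet of drives.
    # Only the 1:16 power ratio is from the article; the rest is assumed.
    DISK_WATTS = 8.0                 # assumed draw of one spinning disk
    FLASH_WATTS = DISK_WATTS / 16    # the article's 1:16 power ratio
    DRIVES = 1_000                   # hypothetical fleet size
    HOURS_PER_YEAR = 24 * 365
    USD_PER_KWH = 0.10               # assumed electricity price

    def annual_energy_cost(watts_per_drive: float) -> float:
        kwh = watts_per_drive * DRIVES * HOURS_PER_YEAR / 1_000
        return kwh * USD_PER_KWH

    print(f"disk:  ${annual_energy_cost(DISK_WATTS):,.0f} per year")   # ~$7,008
    print(f"flash: ${annual_energy_cost(FLASH_WATTS):,.0f} per year")  # ~$438

And because nearly all drive power ends up as heat, the cooling bill shrinks in roughly the same proportion.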

Making the Switch
In fact, it's increasingly feasible that today's data center - still reliant on disk storage - could run on 100 percent flash tomorrow. Telcos, service providers and other major enterprises whose profits are tied to the speed and availability they can offer their customers are beginning to view flash storage's blistering performance less as a "nice to have" option and more as a core technology for maintaining a competitive edge.

While the performance-hungry telcos and service providers are diving into flash straight away, vendors in other vertical markets have run the cost-benefit numbers and elected to hold back until the price of flash storage drops even further. For example, a Dropbox-style file-hosting service for consumer cloud storage is less motivated by fast performance than by the availability of cheap storage at scale. Companies like these are making the usual storage tradeoff, finding a comfortable spot between price and capacity. However, when the price of flash finally reaches parity with disk, the last barrier will fall for those companies that want to remain competitive, and the market shift will be as significant as when disks replaced tape storage by beating it on the same markers: higher performance and better pricing.

Advancements in Software
One of the trends making this shift possible is software-defined storage. By adopting a software-defined approach to storage infrastructure, organizations gain the flexibility to deploy flash storage throughout their data center architectures quickly and easily.

As background, software-defined storage seeks to move functions and features from the hardware layer to the software layer, removing the dependence on expensive hardware redundancies that exist only to mask hardware-level faults. Data center architects must always plan for the inevitable failure of hardware, and flash in particular currently has a shorter time to failure than disk does. In storage environments without such protection, the failure of a drive produces errors that reach the end user; the traditional fix is to build in expensive, redundant RAID cards to hide them. With the right software-defined strategy, these failures can instead be absorbed in software and made invisible to the end user. And since software-defined storage is hardware-agnostic, it can run on any hardware configuration.
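
To make that concrete, here is a minimal sketch in Python of how a software layer can absorb a drive failure; it is a hypothetical illustration of the general technique (replication handled in software), not Compuverde's actual implementation. Each write is copied to two independent nodes, so when one fails the read path simply falls back to a surviving replica and the end user never sees an error.

    import random

    class StorageNode:
        """A stand-in for one commodity server full of flash drives."""
        def __init__(self, name: str):
            self.name = name
            self.alive = True
            self.blocks: dict[str, bytes] = {}

    class SoftwareDefinedStore:
        """Replicates every block in software; no RAID card required."""
        def __init__(self, nodes: list, replicas: int = 2):
            self.nodes = nodes
            self.replicas = replicas

        def write(self, key: str, data: bytes) -> None:
            # Copy the block to `replicas` distinct nodes.
            for node in random.sample(self.nodes, self.replicas):
                node.blocks[key] = data

        def read(self, key: str) -> bytes:
            # Serve from any live replica; a dead node is skipped silently.
            for node in self.nodes:
                if node.alive and key in node.blocks:
                    return node.blocks[key]
            raise IOError(f"all replicas of {key} lost")

    nodes = [StorageNode(f"node{i}") for i in range(4)]
    store = SoftwareDefinedStore(nodes)
    store.write("block-42", b"user data")
    nodes[0].alive = False            # a drive (or a whole node) fails
    print(store.read("block-42"))     # still served; the failure is invisible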

There are a number of additional benefits that telco and service provider data center architects can achieve by combining software-defined storage with flash hardware. For instance, a software-defined approach lets the organization keep a single namespace spanning all of its storage nodes. It can also run applications on the storage nodes themselves, creating combined "compustorage" nodes. As a result, the storage hardware doesn't need to be big or costly, yet can still deliver very high performance. Organizations can start with a small number of cheap servers instead of building a large, expensive, traditional installation, and still scale linearly as needed.
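
As a simple illustration of the single-namespace idea, the hypothetical sketch below hashes each path to the node that owns it. Clients see one flat namespace; the hash function, not the client, decides which cheap server holds the data, which is what lets the cluster grow one small node at a time.

    import hashlib

    def owner(path: str, nodes: list) -> str:
        """Map a path in the single global namespace to its storage node."""
        digest = hashlib.sha256(path.encode()).hexdigest()
        return nodes[int(digest, 16) % len(nodes)]

    nodes = ["node0", "node1", "node2"]
    for path in ("/videos/a.mp4", "/videos/b.mp4", "/db/index"):
        print(path, "->", owner(path, nodes))

    # Adding a fourth node grows capacity without changing the client's
    # view. (Production systems use consistent hashing so that growing
    # the cluster relocates only a small fraction of the objects.)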

Flash Assets
The benefits of a software-defined approach to an all-flash data center include:

  • Huge performance improvements, since faster flash technology can be used throughout the data center.
  • Lower power consumption: SSDs generate far less heat than spinning disks and require less energy for cooling, reducing running costs.
  • A smaller footprint: SSDs are physically much smaller than spinning disks, so they need less data center real estate to house them.
  • More applications running on the same hardware, thanks to the performance gains.

Conclusion
Even as many of us still listen to CDs in the car, the music industry is inevitably shifting to a new paradigm built on music files saved to flash storage. The same trend is repeating across industries, and nowhere as dramatically as in the data center. Flash storage - with its extreme performance, efficient energy usage and increasingly competitive cost - will eventually become the industry status quo. By using software-defined storage, data center architects can design a flexible, efficient and powerful framework for telcos, service providers and major enterprises looking to build the most powerful and energy-efficient data center possible on all flash.

About the Author

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions designed to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, the wide-reaching Internet-based storage solution for the consumer and business markets, with the highest possible availability and scalability requirements. Before that, Stefan worked on system and software architecture on several projects at Ericsson, the Swedish giant and world-leading provider of telecommunications equipment and services to mobile and fixed network operators.
