By Stefan Bernbo
April 4, 2014 12:30 PM EDT
For an example of just how dramatically storage has changed over the past fifteen years, consider your music collection. At one point, you had a collection of cassettes that stored songs on magnetic tape. As the years went on, your hairstyle changed and you bought a CD player that used a spinning disc to store more song data at higher quality than tape could. Spinning disks flourished well into the MP3 player era, surviving even the initial introduction of flash storage thanks to disk's competitive cost. Eventually, however, your newest smartphone or iPod shipped with flash storage instead, as manufacturers bowed to its better performance and increasingly competitive price.
This is an example of a sea change taking place at a much bigger scale as well. Instead of gigabytes, think petabytes.
The data center infrastructures designed by telcos, service providers and major enterprises to store massive quantities of data have until recently relied predominantly on disk storage, sometimes blending in flash for performance-intensive tasks. While the speed and performance of flash have long tempted data center architects to deploy it more widely throughout the data center, only recently has its price dropped enough to make broader use a viable option.
To understand why flash storage has suddenly become a practical choice for data center architects across industries, it is helpful to examine the differences between flash and disk storage.
The Next Big Thing, Again
As the example above shows, when it was introduced, disk storage represented leaps and bounds of progress in speed and efficiency compared to tape storage, the predominant method of the time. Even after flash entered the market, disk remained the server storage of choice. Flash did deliver substantially higher performance, but was priced too high to pose a real threat to the prevalence of spinning disks. In addition, flash drives offered smaller capacities and could not store as much data per dollar as spinning disks.
However, new improvements in flash have slashed its price significantly, positioning it as a true data center hardware alternative whose benefits, throughput and latency, have improved dramatically at the same time. As an added plus, flash is highly energy efficient, drawing only a fraction of the power that disk storage requires, sometimes as little as one-sixteenth. Flash drives still fail at a faster rate than disks do, but the performance gains and price drops of recent years have made flash a realistic and highly attractive option for data center architecture and design needs.
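A quick back-of-envelope calculation shows what that one-to-16 power ratio can mean in running costs. The drive wattage, array size and electricity price below are purely illustrative assumptions, not vendor figures:

```python
# Back-of-envelope power-cost comparison for a hypothetical 1,000-drive array.
# All figures are illustrative assumptions, not measured vendor specs.
DRIVES = 1000
HDD_WATTS = 8.0            # assumed average draw of one spinning disk
SSD_WATTS = HDD_WATTS / 16 # the 1:16 ratio cited above
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10       # assumed electricity price in USD

def annual_cost(watts_per_drive):
    """Yearly electricity cost for the whole array at a given per-drive draw."""
    kwh = watts_per_drive * DRIVES * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

hdd_cost = annual_cost(HDD_WATTS)
ssd_cost = annual_cost(SSD_WATTS)
print(f"HDD array: ${hdd_cost:,.0f}/year")
print(f"SSD array: ${ssd_cost:,.0f}/year")
print(f"Savings:   ${hdd_cost - ssd_cost:,.0f}/year")
```

Under these assumed numbers the flash array's direct power bill is a small fraction of the disk array's, before even counting the reduced cooling load.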
Making the Switch
In fact, it's increasingly feasible that today's data center, still reliant on disk storage, could run on 100 percent flash tomorrow. Telcos, service providers, major enterprises and other companies whose profits are tied to the speed and availability they can deliver to their customers are beginning to see flash storage's blistering performance less as a "nice to have" option and more as a core technology for maintaining a competitive edge.
While the high-performance-demanding telco and service provider industries are diving into flash straight away, vendors in other vertical markets have made cost-benefit calculations and elected to hold back until the price of flash drops further. For example, a Dropbox-style file hosting service for consumer cloud storage is less likely to be motivated by fast performance than by ensuring the availability of cheap storage at scale. Companies like these are making the usual tradeoff in storage: finding a comfortable place between price and capacity. However, when the price of flash finally descends to that of disk storage, the last barrier will be removed for those companies that want to remain competitive. When that milestone arrives, the market shift will be as significant as when disks replaced tape storage by beating it on the same markers: higher performance and better pricing.
Advancements in Software
One of the trends making this shift possible is that of software-defined storage. By adopting a software-defined approach to storage infrastructure, organizations have the flexibility to deploy flash storage throughout their data center architectures quickly and easily.
As background, software-defined storage moves functions and features from the hardware layer to the software layer, removing the dependence on expensive hardware redundancies that exist only to mask hardware-level faults. Data center architects must plan for the inevitable failure of hardware, and flash in particular currently fails faster than disk does. In storage environments without RAID cards, a failed disk produces errors that reach the end user, so architects build in expensive, redundant RAID hardware to hide them. With the right software-defined strategy, those failures can instead be absorbed in software and made invisible to the end user. And since software-defined storage is hardware-agnostic, it can run on any hardware configuration.
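The core idea of absorbing failures in software rather than in RAID hardware can be sketched in a few lines: write each object to several independent nodes, and serve reads from any surviving replica. This is a minimal illustration only; the class names, replica count of three, and in-memory "nodes" are all assumptions for the sketch, not any vendor's actual design:

```python
import random

# Minimal sketch of software-level redundancy: each object is written to
# several independent nodes, so a read survives any single node failure
# without special RAID hardware. All names here are illustrative.
REPLICAS = 3

class Node:
    def __init__(self, name):
        self.name, self.data, self.alive = name, {}, True

class SoftwareDefinedStore:
    def __init__(self, nodes):
        self.nodes = nodes

    def put(self, key, value):
        # Write the object to REPLICAS distinct healthy nodes.
        targets = random.sample([n for n in self.nodes if n.alive], REPLICAS)
        for node in targets:
            node.data[key] = value

    def get(self, key):
        # Any surviving replica satisfies the read; the failure stays invisible.
        for node in self.nodes:
            if node.alive and key in node.data:
                return node.data[key]
        raise KeyError(key)

nodes = [Node(f"node{i}") for i in range(5)]
store = SoftwareDefinedStore(nodes)
store.put("song.mp3", b"...")
nodes[0].alive = False          # a drive or node fails
print(store.get("song.mp3"))    # the read still succeeds
```

With three replicas, any single failure leaves at least two live copies, which is exactly the property that lets cheap drives with higher failure rates, flash included, sit behind a reliable service.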
There are a number of additional benefits that telco and service provider data center architects can achieve by combining software-defined storage with flash hardware. For instance, a software-defined approach lets the organization keep a single namespace spanning all its storage nodes. It can also run applications on the storage nodes themselves, creating combined "compustorage" nodes. As a result, the storage hardware doesn't need to be big or costly to deliver very high performance and speed. Organizations can start with a small number of cheap servers instead of building a large, expensive, traditional installation, and still scale linearly as needed.
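One simple way a single namespace can span many small nodes is deterministic hash-based placement: every client hashes a key to the same node, so the cluster behaves like one store and grows by adding nodes. The node names and the naive modulo scheme below are assumptions for illustration; production systems typically use consistent hashing so that adding a node re-maps only a fraction of keys:

```python
import hashlib

# Sketch of a single namespace across storage nodes: the key's hash
# deterministically selects a node, so every client resolves the same
# path to the same place. Node names are illustrative only.

def node_for(key, nodes):
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

nodes = ["flash-node-1", "flash-node-2", "flash-node-3"]
print(node_for("/videos/cat.mp4", nodes))
print(node_for("/videos/cat.mp4", nodes))  # same key -> same node, always
```

Because placement is pure computation, no central lookup server sits in the data path, which is part of why a pile of small, cheap nodes can scale close to linearly.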
The benefits of a software-defined approach to an all-flash data center include:
- Huge performance improvements from using the faster flash technology throughout the data center.
- Lower power consumption: SSDs generate far less heat than spinning disks and require less energy for cooling, reducing running costs.
- A smaller data center footprint: because SSDs are much smaller than spinning disks, they require less rack space and less real estate to house them.
- The ability to run more applications on the same hardware, thanks to the performance gains.
Even as many of us still listen to CDs in the car, the music industry is inevitably shifting to a new paradigm built on music files saved to flash storage. The same trend is repeating across industries, nowhere more dramatically than in the data center. Flash storage, with its extreme performance, efficient energy usage and increasingly competitive cost, will eventually become the industry status quo. By using software-defined storage, data center architects can design a flexible, efficient and powerful all-flash framework for telcos, service providers and major enterprises seeking the most powerful and energy-efficient data center possible.