New Storage Solutions: Hyperconverged and Hyperscale

Enterprises today need flexible, scalable storage approaches if they hope to keep up with rising data demands

Mobile devices. Cloud-based services. The Internet of Things. What do all of these trends have in common? They are some of the factors driving the unprecedented growth of data today. And where data grows, so does the need for data storage. The traditional method of buying more hardware is cost-prohibitive at the scale needed. As a result, a new storage paradigm is required.

Enterprises today need flexible, scalable storage approaches if they hope to keep up with rising data demands. Software-defined storage (SDS) offers the needed flexibility. In light of the varied storage and compute needs of organizations, two SDS options have arisen: hyperconverged and hyperscale. Each approach has its distinctive features and benefits, which are explored below.

Making Important Distinctions
To appreciate these new SDS options, it is helpful to look back at what came before them. Converged storage combines storage and computing hardware to shorten delivery times and minimize the physical space required in virtualized and cloud-based environments. This was an improvement over the traditional approach, in which storage and compute functions were housed in separate hardware. The goal was to improve data storage and retrieval and to speed the delivery of applications to and from clients.

A "building block" model is the basis of converged storage infrastructure. That is, it uses a hardware-based approach comprised of discrete components, each of which can be used on its own for its original purpose. Converged storage is not centrally managed and does not run on hypervisors; the storage is attached directly to the physical servers.

A software-defined approach, by contrast, is the foundation of hyperconverged storage infrastructure. All components are converged at the software level and cannot be separated out. This model is centrally managed and virtual machine-based. The storage controller and array are deployed on the same server, and compute and storage are scaled together. Each node has both compute and storage capabilities, and data can be stored locally or on another server, depending on how often that data is needed.
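To make the node model concrete, here is a minimal Python sketch of that local-versus-remote placement idea. It is a toy model, not any vendor's API: the class names, the access-count threshold, and the least-loaded-peer policy are all illustrative assumptions.

```python
from dataclasses import dataclass, field

# Illustrative threshold: data read this often counts as "hot" (assumed value).
HOT_ACCESS_THRESHOLD = 10

@dataclass
class HyperconvergedNode:
    """One building block: compute and storage on the same server."""
    name: str
    local_store: dict = field(default_factory=dict)

@dataclass
class Cluster:
    nodes: list
    access_counts: dict = field(default_factory=dict)

    def read(self, key):
        """Reads bump the access count that drives placement."""
        self.access_counts[key] = self.access_counts.get(key, 0) + 1
        for node in self.nodes:
            if key in node.local_store:
                return node.local_store[key]
        raise KeyError(key)

    def place(self, key, value, origin):
        """Keep hot data on the node that uses it; spill cold data
        to the least-loaded peer elsewhere in the cluster."""
        if self.access_counts.get(key, 0) >= HOT_ACCESS_THRESHOLD:
            target = origin                      # hot: keep it local
        else:
            target = min(self.nodes, key=lambda n: len(n.local_store))
        target.local_store[key] = value

a, b = HyperconvergedNode("node-a"), HyperconvergedNode("node-b")
cluster = Cluster(nodes=[a, b])
cluster.place("vm-image", b"...", origin=a)  # cold data may land on node-b
```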

Today's data demands require agility and flexibility, and that is exactly what hyperconverged storage offers. It also promotes cost savings: organizations can use commodity servers, since software-defined storage works by taking features typically found in hardware and moving them to the software layer. Organizations that need compute and storage to scale 1:1, such as those deploying VDI environments, are natural fits for the hyperconverged approach. The model is useful in many business scenarios because every building block works exactly the same way; it's just a question of how many building blocks a data center needs.

If hyperconverged storage is so flexible and efficient, why would anyone need hyperscale storage? It's a new storage approach created to address differing storage needs. Hyperscale computing is a distributed computing environment in which the storage controller and array are separated. As its name implies, hyperscale is the ability of an architecture to scale quickly as greater demands are made on the system. This kind of scalability is required in order to build Big Data or cloud systems. It's what Internet giants like Amazon and Google use to meet their vast storage demands. However, software-defined storage now enables many enterprises to enjoy the benefits of hyperscale.
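As a rough illustration of that separation, the sketch below models a hyperscale cluster as two independent pools that grow on their own schedules. The class names and default sizes are hypothetical, chosen only to show the scaling behavior the article describes.

```python
from dataclasses import dataclass, field

@dataclass
class StorageNode:
    capacity_tb: int

@dataclass
class ComputeNode:
    vcpus: int

@dataclass
class HyperscaleCluster:
    """Controller/compute and storage array are separate pools."""
    storage: list = field(default_factory=list)
    compute: list = field(default_factory=list)

    def add_storage(self, n, capacity_tb=48):
        """Grow capacity without buying more compute."""
        self.storage += [StorageNode(capacity_tb) for _ in range(n)]

    def add_compute(self, n, vcpus=64):
        """Grow compute without buying more disks."""
        self.compute += [ComputeNode(vcpus) for _ in range(n)]

    @property
    def capacity_tb(self):
        return sum(s.capacity_tb for s in self.storage)

# Demand spikes on the storage side only, so only that pool scales.
cluster = HyperscaleCluster()
cluster.add_storage(10)     # +480 TB raw
cluster.add_compute(2)      # compute grows on its own schedule
print(cluster.capacity_tb)  # 480
```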

For instance, organizations that choose hyperscale storage can reduce their total cost of ownership. That's because commodity servers are typically used, and a data center can run millions of virtual servers without the added expense that the same number of physical servers would require. Data center managers want to get rid of the refrigerator-sized disk shelves found in NAS and SAN solutions, which are difficult to scale and very expensive. With hyper solutions, it is easy to start small and scale up as needed. Using standard servers in a hyper setup creates a flattened architecture: less hardware needs to be bought, and the hardware that is bought costs less. Hyperscale enables organizations to buy commodity hardware; hyperconverged goes one step further by running both elements, compute and storage, on the same commodity hardware. It becomes a question of how many servers are necessary.
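To see how the "how many servers" question plays out, here is a back-of-the-envelope sizing calculation. The node capacity, replication factor, and capacity target are assumed figures for illustration only.

```python
# Assumed figures: 48 TB raw per commodity node, 3x replication,
# and a target of 2 PB (2,000 TB) of usable capacity.
node_raw_tb = 48
replication_factor = 3
target_usable_tb = 2000

usable_per_node = node_raw_tb / replication_factor      # 16 TB usable per node
nodes_needed = -(-target_usable_tb // usable_per_node)  # ceiling division

print(f"{nodes_needed:.0f} commodity nodes")            # -> 125 nodes
```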

The Best of Both Worlds
In a hyperconverged approach, there is essentially one box with everything in it; hyperscale has two sets of boxes, one for storage and one for compute. Which to use depends on what the architect wants to do, given the needs of the business. A software-defined storage solution can take over all the hardware and turn it into a type of appliance, or it can run as a virtual machine, which makes it a hyperconverged configuration.
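One way to express that architect's choice is a small decision helper. The rule of thumb it encodes (co-locate when compute and storage grow roughly in lockstep, separate the pools when they diverge) follows the article's framing; the function, its parameters, and the 0.5-2.0 band are hypothetical.

```python
def choose_topology(compute_growth, storage_growth):
    """Pick a deployment model from expected annual growth rates.

    If compute and storage scale roughly 1:1 (e.g., VDI workloads),
    hyperconverged building blocks fit; if one dimension grows much
    faster than the other, separate pools (hyperscale) fit better.
    The 0.5-2.0 band is an illustrative assumption, not a standard.
    """
    ratio = storage_growth / max(compute_growth, 1e-9)
    if 0.5 <= ratio <= 2.0:
        return "hyperconverged"  # one box with everything in it
    return "hyperscale"          # storage boxes plus compute boxes

print(choose_topology(compute_growth=0.2, storage_growth=0.8))  # hyperscale
```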

In light of the exponential increase in storage demand, it is comforting to know that data center architects don't have to choose one of these solutions to the exclusion of the other. The architectures can be combined to accommodate specific needs at specific times. Storage needs are fluid, which makes flexible storage solutions ideal, and hyper solutions save money by not requiring expensive hardware. Together, hyperconverged and hyperscale storage approaches offer the best of both worlds.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years he has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, a wide-reaching Internet-based storage provider for consumer and business markets with the highest availability and scalability requirements. Before that, Stefan worked on system and software architecture on several projects at Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.
