
2014 Poised to be the Year of the Hybrid Storage Array

NEWARK, Calif., Jan. 27, 2014 /PRNewswire/ -- Hybrid storage offers better performance than traditional hard disk-based storage at significantly less expense than all solid-state arrays, giving organizations the increasingly desired speed of flash media without sacrificing the capacity and cost advantages of spinning disks. That combination positions hybrid arrays to be among the most widely implemented storage solutions of the coming year, say experts at Tegile Systems, the leading provider of flash-driven storage arrays for virtualized server and virtual desktop environments.


Standard hard disk-based arrays offer capacity at a low cost, but they cannot keep up with today's performance requirements. Solid-state disks deliver high performance but cannot provide the required capacity at a reasonable cost. Hybrid storage arrays are faster than hard disk drive-based arrays and less expensive than SSD-based arrays, balancing high performance with high capacity and often including a full suite of features at a price that makes them an ideal solution for a wide range of industries and applications.
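
To see why that balance is compelling, consider a back-of-the-envelope cost comparison for a 100 TB pool. The per-terabyte prices below are illustrative assumptions (roughly in the spirit of 2014-era street prices), not figures from the release.

```python
# Back-of-the-envelope cost comparison for a 100 TB storage pool.
# Prices per TB are illustrative assumptions, not vendor figures.
CAPACITY_TB = 100
HDD_COST_PER_TB = 50      # assumed: nearline 7.2K RPM disk
SSD_COST_PER_TB = 1_000   # assumed: enterprise flash

all_hdd   = CAPACITY_TB * HDD_COST_PER_TB
all_flash = CAPACITY_TB * SSD_COST_PER_TB
# Hybrid: full HDD capacity for persistence, plus a flash cache sized at
# 10% of the data set -- enough to hold a typical hot working set.
hybrid = all_hdd + 0.10 * CAPACITY_TB * SSD_COST_PER_TB

print(f"all-HDD:   ${all_hdd:>9,}")      # cheap, but slow random I/O
print(f"hybrid:    ${hybrid:>9,.0f}")    # ~3x HDD cost, near-flash reads
print(f"all-flash: ${all_flash:>9,}")    # ~7x the hybrid price here
```

Under these assumed prices, the hybrid pool costs roughly triple the all-HDD pool but about one-seventh of the all-flash pool, while still serving most reads from flash.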

News touting the IOPS gains of all-flash arrays continues to garner headlines, but the reality is that for 99 percent of organizations, that level of performance far exceeds the needs of their business-critical applications. Those running specialized applications, such as real-time financial transactions, and those in Big Data, HPC and scientific research environments may benefit from storage systems that can deliver 1 million IOPS, but most corporate business applications require only a fraction of that performance. Hybrid arrays can service the vast majority of high-performance workloads for a fraction of the cost of all-flash systems.

Customers and the financial community alike have come to this realization. Investors recently signaled their approval of hybrid systems through the overwhelming success of Nimble Storage's IPO and the subsequent gains in the company's stock price, indicating that they see hybrid storage as a good investment with room for growth in the sector. Those gains stand in stark contrast to the fall in the stock price of all-flash array provider Violin Memory.

"The storage industry is at a bit of an inflection point in regards to the best approach organizations can take to satisfy the increasing needs of performance-hungry applications while staying true to the company's bottom line," said Rob Commins, Tegile vice president of marketing.  "Hybrid storage that marries the best of SSD and HDD is poised to make tremendous gains this year.  However, buyers still need to do the research as all hybrids are not created equal.  Truly effective hybrid array solutions need to be engineered to address all the demands of the modern data center rather than simply be a combination of two disparate technologies sold as a single offering."

Organizations looking to maximize their IT investment by adopting a hybrid approach to their storage infrastructure need to find solutions that deliver faster performance, higher capacities and robust data protection with near-instant recovery times while being affordable, efficient and easy to use.  Among the features users should look for in a hybrid solution are:

  • True unified access to both block protocols (iSCSI, Fibre Channel) and file protocols (NFS and CIFS), providing the flexibility to solve business challenges with a single storage platform. This is more effective and less expensive than attempting to pull together multiple storage platforms to satisfy different storage requirements.
  • Optimized hybrid storage that intelligently uses DRAM and flash in the data path as a high-speed cache while storing user data permanently on hard disks. This is a superior approach to solutions that rely entirely on flash for permanent storage, which carries the cost and write-endurance penalties of the medium (see the sketch after this list).
  • Integrated data protection functionality such as automated snapshot schedules for recovery points, fully redundant metadata storage and replication that is designed to survive multiple disk failures in order to ensure continued availability of business-critical data.
  • Inline compression and deduplication applied throughout the array to reduce the acquisition and operational cost of storage. Reducing all application data, not just secondary data sets, can consume up to 75 percent less capacity (a 4:1 data-reduction ratio) than alternative solutions.
  • SSDs leveraged throughout the data path, rather than deployed as a separate storage tier, to ensure every application receives a performance boost from flash. Combined with intelligent, application-aware software, the storage system can configure and optimize itself based on the application that uses it.
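
To make the caching model in the second bullet concrete, here is a minimal sketch of a hybrid read path: a small LRU flash cache fronting large, slow disks, with the permanent copy always on disk. The class, the eviction policy and the workload are illustrative assumptions, not a description of any vendor's implementation.

```python
import random
from collections import OrderedDict

class HybridReadPath:
    """Toy model of a hybrid array's read path: a small LRU flash cache
    in front of large, slow disks. Data always lives permanently on the
    disk layer; the cache holds only copies, so losing it loses no data."""

    def __init__(self, cache_blocks):
        self.cache = OrderedDict()   # block_id -> data (flash layer)
        self.cache_blocks = cache_blocks
        self.disk = {}               # block_id -> data (HDD layer)
        self.hits = self.misses = 0

    def write(self, block_id, data):
        self.disk[block_id] = data       # persist to disk first
        self._cache_put(block_id, data)  # then warm the cache

    def read(self, block_id):
        if block_id in self.cache:
            self.hits += 1
            self.cache.move_to_end(block_id)  # refresh LRU position
            return self.cache[block_id]
        self.misses += 1
        data = self.disk[block_id]           # slow path: fetch from disk
        self._cache_put(block_id, data)      # promote the hot block to flash
        return data

    def _cache_put(self, block_id, data):
        self.cache[block_id] = data
        self.cache.move_to_end(block_id)
        if len(self.cache) > self.cache_blocks:
            self.cache.popitem(last=False)   # evict least recently used

# A skewed workload: 90% of reads target a 100-block hot set, so a cache
# holding just 10% of the 1,000 blocks absorbs most of the traffic.
array = HybridReadPath(cache_blocks=100)
for i in range(1000):
    array.write(i, f"block-{i}")
for _ in range(10_000):
    hot = random.random() < 0.9
    array.read(random.randint(0, 99) if hot else random.randint(100, 999))
print(f"cache hit rate: {array.hits / (array.hits + array.misses):.0%}")
```

Even this naive LRU policy shows why a modest amount of flash goes a long way: hit rates track the size of the working set, not the total capacity.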

"Tegile Systems has pioneered a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price to help organizations solve storage challenges in virtual server, database and VDI environments," said Commins.  "We believe this is the year that more users than ever will discover just how hybrid storage arrays can gain the performance benefits of solid state without sacrificing the cost benefits of hard disk storage at the right price point."

About Tegile Systems
Tegile Systems is pioneering a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price for virtualization, file services and database applications. With Tegile's Zebi line of hybrid storage arrays, the company is redefining the traditional approach to storage by providing a family of arrays that is significantly faster than arrays based entirely on hard disks and significantly less expensive than arrays based entirely on solid-state disks.

Tegile's patented MASS technology accelerates the Zebi's performance and enables on-the-fly deduplication and compression of data, so each Zebi has a usable capacity far greater than its raw capacity. Tegile's award-winning technology solutions enable customers to better address the requirements of server virtualization, virtual desktop integration and database integration than other offerings. Featuring both NAS and SAN connectivity, Tegile arrays are easy to use, fully redundant and highly scalable. They come complete with built-in auto-snapshot, auto-replication, near-instant recovery, onsite or offsite failover, and virtualization management features. Additional information is available at www.tegile.com. Follow Tegile on Twitter @tegile.
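
As a quick illustration of how data reduction makes usable capacity exceed raw capacity, the arithmetic below assumes a hypothetical 4:1 reduction ratio, which is the ratio implied by the "up to 75 percent" figure cited earlier; real-world ratios vary widely with the data set.

```python
# Usable capacity under inline deduplication and compression.
# The 4:1 reduction ratio is an illustrative assumption: it is the ratio
# implied by "up to 75 percent less capacity used" (1 - 1/4 = 75%).
raw_tb = 20
reduction_ratio = 4.0

usable_tb = raw_tb * reduction_ratio
physical_savings = 1 - 1 / reduction_ratio

print(f"{raw_tb} TB raw stores ~{usable_tb:.0f} TB of logical data")
print(f"physical capacity consumed: {physical_savings:.0%} less than storing it unreduced")
```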

MEDIA CONTACT:
Dan Miller, JPR Communications
818-884-8282, ext. 13

SOURCE Tegile Systems

