2014 Poised to be the Year of the Hybrid Storage Array

NEWARK, Calif., Jan. 27, 2014 /PRNewswire/ -- Hybrid storage delivers better performance than traditional hard disk-based storage at significantly less expense than all-solid-state arrays, giving organizations the increasingly desired speed of flash media without sacrificing the capacity and cost advantages of spinning disks. That balance positions hybrid arrays to be among the most widely implemented storage solutions of the upcoming year, say experts at Tegile Systems, the leading provider of flash-driven storage arrays for virtualized server and virtual desktop environments.

(Logo: http://photos.prnewswire.com/prnh/20130215/LA61196LOGO)

Standard hard disk-based arrays offer capacity at a low cost but they cannot keep up with today's performance requirements.  Solid state disks deliver high performance but cannot deliver the required capacity at a reasonable cost.  Hybrid storage arrays are faster than hard disk drive-based arrays and less expensive than SSD-based arrays.  They balance high performance with high capacity and often come with a full suite of features at a price that makes them an ideal solution for a wide range of industries and applications.

News touting the IOPS gains of all-flash arrays continues to garner headlines, but the reality is that for 99 percent of organizations, this level of performance far exceeds the needs of their business-critical applications.  Those running specialized applications, such as real-time financial transactions, and those in Big Data, HPC and scientific research environments may benefit from storage systems that can deliver 1 million IOPS, but most corporate business applications require only a fraction of this performance.  Hybrid arrays can service the vast majority of high-performance workloads for a fraction of the cost of all-flash systems.

Both customers and the financial community have come to this realization; the latter recently signaled its approval of hybrid systems through the overwhelming success of Nimble Storage's IPO and subsequent gains in the company's stock price.  Investors have signaled that hybrid storage is a good investment and that there is room for growth in the sector.  The gains in Nimble's stock stand in stark contrast to the falling stock price of all-flash array provider Violin Memory.

"The storage industry is at a bit of an inflection point with regard to the best approach organizations can take to satisfy the increasing needs of performance-hungry applications while staying true to the company's bottom line," said Rob Commins, Tegile vice president of marketing.  "Hybrid storage that marries the best of SSD and HDD is poised to make tremendous gains this year.  However, buyers still need to do their research, because not all hybrids are created equal.  Truly effective hybrid array solutions need to be engineered to address all the demands of the modern data center rather than simply be a combination of two disparate technologies sold as a single offering."

Organizations looking to maximize their IT investment by adopting a hybrid approach to their storage infrastructure need to find solutions that deliver faster performance, higher capacities and robust data protection with near-instant recovery times while being affordable, efficient and easy to use.  Among the features users should look for in a hybrid solution are:

  • True unified access of both block protocols (iSCSI, Fibre Channel) and file protocols (NFS and CIFS) to provide flexibility in solving business challenges with a single storage platform.  This is more effective and less expensive than attempting to pull together multiple storage platforms to satisfy different storage requirements.
  • Optimized hybrid storage that intelligently uses SSDs with DRAM and flash in the data path to protect user data by storing it permanently on hard disks while using the faster technologies as a high-speed cache.  This is a superior approach to solutions that rely entirely on flash for data storage, risking data loss through volatility of the medium.
  • Integrated data protection functionality such as automated snapshot schedules for recovery points, fully redundant metadata storage and replication that is designed to survive multiple disk failures in order to ensure continued availability of business-critical data.
  • Inline compression and deduplication utilized throughout the array to reduce the acquisition and operational cost of storage.  Reducing all application data, not just secondary applications, can result in up to 75 percent less capacity used than alternative solutions.
  • Leveraging the benefits of SSDs throughout the data path, rather than as a tier of storage, to ensure every application receives a performance boost with flash storage.  Combined with intelligent, application-aware software, the storage system can configure and optimize itself based on the application that uses it.

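The tiered caching approach in the second bullet above can be sketched as a toy model: a small DRAM tier backed by a larger flash tier, with hard disks always holding the authoritative copy of every block, so a cache failure can never lose data. The class and tier sizes below are purely illustrative assumptions, not any vendor's actual implementation.

```python
from collections import OrderedDict

class HybridReadCache:
    """Toy model of a hybrid array's read path: a small, fast DRAM tier
    backed by a larger flash tier, with every block persisted on disk."""

    def __init__(self, dram_blocks, flash_blocks, disk):
        self.dram = OrderedDict()     # fastest tier, smallest capacity
        self.flash = OrderedDict()    # larger tier, still far faster than disk
        self.dram_blocks = dram_blocks
        self.flash_blocks = flash_blocks
        self.disk = disk              # block id -> data; the authoritative copy

    def read(self, block):
        # 1. DRAM hit: cheapest possible read.
        if block in self.dram:
            self.dram.move_to_end(block)
            return self.dram[block]
        # 2. Flash hit: promote the block into DRAM.
        if block in self.flash:
            data = self.flash.pop(block)
        else:
            # 3. Miss: fall through to the hard disks, which always hold the data.
            data = self.disk[block]
        self._insert(self.dram, self.dram_blocks, block, data, demote=True)
        return data

    def _insert(self, tier, capacity, block, data, demote):
        tier[block] = data
        tier.move_to_end(block)
        if len(tier) > capacity:
            old_block, old_data = tier.popitem(last=False)  # evict LRU block
            if demote:
                # Demote from DRAM into flash instead of dropping it outright.
                self._insert(self.flash, self.flash_blocks,
                             old_block, old_data, demote=False)
```

Because the disk copy is permanent, evicting or even losing the DRAM and flash tiers costs only performance, never data, which is the distinction the bullet draws against designs that rely on volatile media for primary storage.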
"Tegile Systems has pioneered a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price to help organizations solve storage challenges in virtual server, database and VDI environments," said Commins.  "We believe this is the year that more users than ever will discover how hybrid storage arrays can deliver the performance benefits of solid state without sacrificing the cost benefits of hard disk storage, at the right price point."

About Tegile Systems
Tegile Systems is pioneering a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price for virtualization, file services and database applications. With Tegile's Zebi line of hybrid storage arrays, the company is redefining the traditional approach to storage by providing a family of arrays that is significantly faster than arrays built entirely on hard disks and significantly less expensive than arrays built entirely on solid-state disks.

Tegile's patented MASS technology accelerates the Zebi's performance and enables on-the-fly de-duplication and compression of data so each Zebi has a usable capacity far greater than its raw capacity. Tegile's award-winning technology solutions enable customers to better address the requirements of server virtualization, virtual desktop integration and database integration than other offerings. Featuring both NAS and SAN connectivity, Tegile arrays are easy to use, fully redundant, and highly scalable. They come complete with built-in auto-snapshot, auto-replication, near-instant recovery, onsite or offsite failover, and virtualization management features. Additional information is available at www.tegile.com. Follow Tegile on Twitter @tegile.
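The on-the-fly de-duplication and compression described above, which is also the mechanism behind the "up to 75 percent less capacity" claim earlier in the release, can be illustrated with a minimal sketch: each incoming block is fingerprinted, duplicate blocks are stored once by reference, and unique blocks are compressed before landing on disk. This is a generic inline data-reduction pipeline under assumed names, not Tegile's MASS implementation.

```python
import hashlib
import zlib

class DedupCompressStore:
    """Toy inline data-reduction pipeline: fingerprint every block,
    keep one physical copy per unique block, compress it on ingest."""

    def __init__(self):
        self.blocks = {}    # fingerprint -> compressed payload (physical store)
        self.volume = []    # logical volume: ordered list of fingerprints
        self.raw_bytes = 0  # logical bytes written by the application

    def write(self, data: bytes) -> None:
        self.raw_bytes += len(data)
        fp = hashlib.sha256(data).hexdigest()
        if fp not in self.blocks:
            # Dedup: only the first copy of a block is stored; inline
            # compression shrinks it before it is written out.
            self.blocks[fp] = zlib.compress(data)
        self.volume.append(fp)

    def read(self, index: int) -> bytes:
        return zlib.decompress(self.blocks[self.volume[index]])

    def stored_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())
```

Writing the same 4 KB block four times costs roughly one compressed block of physical capacity rather than 16 KB, which is why usable capacity on such a system can far exceed raw capacity for redundant workloads like VDI images.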

MEDIA CONTACT:
Dan Miller, JPR Communications
818-884-8282, ext. 13

SOURCE Tegile Systems
