
Tegile Systems Combines FlashVols with Deduplication, Compression to Maximize Performance of SQL Server Deployments

NEWARK, Calif., Feb. 4, 2014 /PRNewswire/ -- Tegile Systems, the leading provider of flash-driven storage arrays for virtualized server and virtual desktop environments, today announced the addition of FlashVols to its Metadata Accelerated Storage System (MASS) architecture to optimize SQL performance and integration.


FlashVols are volumes that are pinned in SSD so applications run at maximum performance, without the potential delays introduced by caching algorithms or tiering policies. Combined with Tegile's deduplication and compression technologies, this enhancement allows medium and large enterprises that need high-performance, high-availability storage to benefit from flash-grade performance without incurring the costs traditionally associated with high-RPM disk drives.
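The pinning idea can be illustrated with a toy model (a hypothetical sketch, not Tegile's implementation; the latency figures and cache-hit rate are made-up assumptions): a tiered volume only sometimes finds its data in flash, while a pinned volume always does.

```python
import random

# Illustrative latencies only -- not measured Tegile figures
FLASH_LATENCY_MS = 0.5
DISK_LATENCY_MS = 8.0

class Volume:
    def __init__(self, pinned=False, cache_hit_rate=0.8):
        self.pinned = pinned                  # FlashVol-style: always resident in flash
        self.cache_hit_rate = cache_hit_rate  # tiered volume: probabilistic flash hit

    def read_latency_ms(self):
        # A pinned volume never falls through to disk; a tiered one can miss
        if self.pinned or random.random() < self.cache_hit_rate:
            return FLASH_LATENCY_MS
        return DISK_LATENCY_MS

def avg_latency(vol, n=10_000):
    return sum(vol.read_latency_ms() for _ in range(n)) / n

tiered = Volume(pinned=False)
pinned = Volume(pinned=True)
print(f"tiered: {avg_latency(tiered):.2f} ms, pinned: {avg_latency(pinned):.2f} ms")
```

The point of the sketch is the variance, not the absolute numbers: the tiered volume's average latency depends on a hit rate the DBA cannot control, while the pinned volume's latency is deterministic.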

"DBAs spend a lot of time looking at storage latency. During our Microsoft FastTrack Data Warehouse testing, we saw latencies in the 1-10 millisecond range, which is in line with what one would expect from an all-flash array," said Larry Chestnut, Senior Architect with Scalability Experts, in an interview with Microsoft SQL Server MVP Rick Heigis. "Tegile's eMLC NAND flash implementation is very durable as well. Each SSD drive in the system can withstand 3.5 petabytes of written data before signs of write wear may appear." The system in the Microsoft FastTrack tests was configured with 10 SSD drives, yielding an aggregate write duty cycle of 35 petabytes.
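The endurance figures quoted above combine as simple arithmetic. The sustained 1 GB/s write rate in the second half is a made-up assumption for illustration, not a figure from the tests:

```python
# Aggregate write endurance from the figures quoted above
drive_endurance_pb = 3.5   # petabytes of writes each eMLC SSD can absorb
drive_count = 10           # drives in the FastTrack test configuration

aggregate_pb = drive_endurance_pb * drive_count
print(aggregate_pb)        # 35.0 -- total write duty cycle in PB

# Hypothetical: at a sustained 1 GB/s of writes spread evenly across the pool
seconds = aggregate_pb * 1e6 / 1.0   # 1 PB = 1e6 GB, so lifetime in seconds
years = seconds / (3600 * 24 * 365)
print(round(years, 1))               # 1.1 -- years until the pool's rated wear limit
```

Even under that deliberately punishing assumption, the aggregate figure shows why endurance scales with drive count as well as per-drive rating.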

Tegile Zebi hybrid storage arrays simplify the deployment, configuration and management of SQL Server databases by delivering consistently low latencies and the right amount of IOPS for each workload. Featuring a pool of DRAM, flash and optimized disks, Tegile's Zebi arrays deliver significant acceleration for most Data Warehouse and Business Intelligence workloads when compared to traditional disk arrays or tiered solutions. Pinned volumes remain in DRAM or flash, close to SQL Server, without any involvement from the DBA, for ultimate ease of use.

In conjunction with Tegile's VSS provider, organizations can use the Zebi array to present multiple copies of the same data from a single source.  When copies are updated, only deltas are saved.  Using this feature can result in dramatic savings when dealing with testing and development environments that utilize multiple instances of the same database.  Inline deduplication and compression reduce overhead for highly repetitive SQL Server workloads.
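The delta-only copies described above can be sketched as a minimal copy-on-write clone (an illustrative model only, not Tegile's VSS provider or on-disk format): clones share blocks with their source and store just the blocks written after cloning.

```python
# Minimal copy-on-write clone sketch (illustrative, not Tegile's implementation).
# A clone shares all blocks with its source; writes land in a private delta map.

class SourceVolume:
    def __init__(self, blocks):
        self.blocks = dict(blocks)   # block number -> data

class Clone:
    def __init__(self, source):
        self.source = source
        self.deltas = {}             # only blocks modified after cloning

    def read(self, blockno):
        # Prefer the clone's own delta, fall back to the shared source block
        return self.deltas.get(blockno, self.source.blocks.get(blockno))

    def write(self, blockno, data):
        self.deltas[blockno] = data  # copy-on-write: the source stays untouched

src = SourceVolume({0: b"schema", 1: b"rows"})
dev = Clone(src)                     # a "test/dev copy" of the production database
dev.write(1, b"test rows")

print(dev.read(0))                   # b'schema'    (still shared with the source)
print(dev.read(1))                   # b'test rows' (the clone's delta)
print(src.blocks[1])                 # b'rows'      (source unmodified)
print(len(dev.deltas))               # 1 -- extra capacity consumed, in blocks
```

This is why ten test/dev clones of the same database consume roughly one database's worth of capacity plus their accumulated deltas, rather than ten full copies.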

"We've engaged with Microsoft to optimize performance and reduce the cost of SQL Server storage by eliminating the need to create multiple copies of the same data to deliver the required IOPS, saving time and capacity in test, development and database environments," said Rob Commins, Tegile vice president of marketing. "These savings, in conjunction with deduplication and compression, help organizations ensure that flash resources are better utilized and more efficient. Additionally, SQL workloads are automatically optimized by the MASS architecture, ensuring that administrators can focus on their database environments rather than their storage infrastructure."

About Tegile Systems
Tegile Systems is pioneering a new generation of flash-driven enterprise storage arrays that balance performance, capacity, features and price for virtualization, file services and database applications. With Tegile's Zebi line of hybrid storage arrays, the company is redefining the traditional approach to storage by providing a family of arrays that is significantly faster than all hard disk-based arrays and significantly less expensive than all solid-state disk-based arrays.

Tegile's patented MASS technology accelerates the Zebi's performance and enables on-the-fly deduplication and compression of data, so each Zebi has a usable capacity far greater than its raw capacity. Tegile's award-winning technology solutions enable customers to better address the requirements of server virtualization, virtual desktop integration and database integration than other offerings. Featuring both NAS and SAN connectivity, Tegile arrays are easy to use, fully redundant and highly scalable. They come complete with built-in auto-snapshot, auto-replication, near-instant recovery, onsite or offsite failover, and virtualization management features. Additional information is available at www.tegile.com. Follow Tegile on Twitter @tegile.

MEDIA CONTACT:
Dan Miller, JPR Communications
818-884-8282, ext. 13

SOURCE Tegile Systems

