
DataCore Announces Enterprise-Class Virtual SANs and Flash-Optimizing Stack in its Next Generation SANsymphony-V10 Software-Defined Storage Platform

Amid pent-up demand for enterprise-grade virtual SANs and the need for cost-effective use of Flash technology, DataCore, a leader in software-defined storage, today revealed new virtual SAN functionality and significant enhancements in its SANsymphony™-V10 software – the 10th-generation release of its comprehensive storage services platform. The new release significantly advances virtual SAN capabilities designed to achieve the fastest performance, highest availability and optimal use of Flash and disk storage directly attached to application hosts and clustered servers in virtual (server-side) SAN use cases.

DataCore’s new Virtual SAN is a software-only solution that automates and simplifies storage management and provisioning while delivering enterprise-class functionality, automated recovery and significantly faster performance. It is easy to set up and runs on new or existing x86 servers where it creates a shared storage pool out of the internal Flash and disk storage resources available to that server. This means the DataCore™ Virtual SAN can be cost-effectively deployed as an overlay, without the need to make major investments in new hardware or complex SAN gear.
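Conceptually, the pooling idea described above can be sketched in a few lines of code. This is a minimal illustration only – the class, device names and provisioning call are assumptions for the sketch, not DataCore's actual interfaces:

```python
# Hypothetical sketch: aggregate a server's internal flash and disk into one
# shared pool and thin-provision virtual volumes from it. Names and the
# StoragePool API are illustrative, not DataCore's implementation.

class Device:
    def __init__(self, name, kind, capacity_gb):
        self.name = name          # e.g. "ssd0"
        self.kind = kind          # "flash" or "disk"
        self.capacity_gb = capacity_gb

class StoragePool:
    """Presents a server's internal devices as one shared capacity pool."""
    def __init__(self, devices):
        self.devices = devices
        self.allocated_gb = 0

    @property
    def total_gb(self):
        return sum(d.capacity_gb for d in self.devices)

    def provision(self, size_gb):
        """Carve a virtual volume out of the pool if capacity allows."""
        if self.allocated_gb + size_gb > self.total_gb:
            raise RuntimeError("pool exhausted")
        self.allocated_gb += size_gb
        return {"size_gb": size_gb, "pool_free_gb": self.total_gb - self.allocated_gb}

pool = StoragePool([Device("ssd0", "flash", 800), Device("hdd0", "disk", 4000)])
vol = pool.provision(1000)
```

The point of the sketch is the overlay aspect: the pool is built from whatever internal devices the server already has, rather than from dedicated SAN hardware.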

DataCore contrasts its enterprise-class virtual SAN offering with competing products which are:

  • Incapable of sustaining serious workloads or providing a growth path to physical SAN assets.
  • Inextricably tied to a specific server hypervisor, rendering them unusable in all but the smallest branch office environments or non-critical test and development scenarios.

The Ultimate Virtual SAN: Inexhaustible Performance, Continuous Availability, Large Scale

There is no compromise on performance, availability and scaling with DataCore. The new SANsymphony-V10 virtual SAN software scales performance to more than 50 million IOPS and capacity to 32 petabytes across a cluster of 32 servers, making it one of the most powerful and scalable systems on the market.
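A quick back-of-the-envelope check puts those headline cluster figures in per-node terms (press-release numbers, not independent benchmarks):

```python
# Per-node averages implied by the stated 32-node cluster maximums.
nodes = 32
total_iops = 50_000_000
total_capacity_pb = 32

iops_per_node = total_iops / nodes                 # average IOPS per node
capacity_per_node_pb = total_capacity_pb / nodes   # average capacity per node

print(f"{iops_per_node:,.0f} IOPS/node, {capacity_per_node_pb} PB/node")
```

So the claim works out to roughly 1.56 million IOPS and 1 PB of capacity per node on average.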

Enterprise-class availability comes standard with a DataCore virtual SAN: the software includes automated failover and failback recovery, and can span an N+1 grid of up to 32 nodes stretched over metro-wide distances. With a DataCore virtual SAN, business continuity, remote-site replication and data protection are simple to implement and, once configured, automatic thereafter.
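The failover behavior described above can be illustrated with a minimal sketch, assuming each volume is synchronously mirrored on a second node (node names and the mirror map are hypothetical, not DataCore's implementation):

```python
# Minimal N+1 failover sketch: I/O for a volume is served by the first healthy
# node in its mirror pair, so a node failure redirects automatically.

mirrors = {"vol1": ("node-a", "node-b"), "vol2": ("node-b", "node-c")}
failed = set()

def serve(volume):
    """Return the first healthy node hosting the volume's mirror pair."""
    for node in mirrors[volume]:
        if node not in failed:
            return node
    raise RuntimeError(f"no healthy replica for {volume}")

assert serve("vol1") == "node-a"   # primary healthy
failed.add("node-a")               # simulate a node failure
assert serve("vol1") == "node-b"   # automatic failover to the mirror copy
```

Failback, in these terms, is the reverse step: once `node-a` is repaired and resynchronized, it is removed from the failed set and resumes serving.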

DataCore SANsymphony-V10 also handles mixed combinations of virtual and physical SANs and accounts for the likelihood that a virtual SAN will extend into an external SAN, whether the need for centralized storage services and hardware-consolidation efficiencies arises at the outset or in later stages of a project. DataCore stands apart from the competition in that it can run server-side as a virtual SAN, it can run and manage physical SANs, and it can operate and federate across both. SANsymphony-V10 thus provides a comprehensive growth path that extends the scope of the virtual SAN to non-disruptively incorporate external storage as part of an overall architecture.

A Compelling Solution for Expanding Enterprises

While larger environments will be drawn by SANsymphony-V10’s impressive specs, many customers have relatively modest requirements for their first virtual SAN. Typically they are looking to cost-effectively deploy fast in-memory technologies to speed up critical business applications, add resiliency, and grow to integrate multiple systems across multiple sites, all within limited commodity-equipment budgets.

“We enable clients to get started with a high performance, stretchable and scalable virtual SAN at an appealing price, that takes full advantage of inexpensive servers and their internal drives,” said Paul Murphy, vice president of worldwide marketing at DataCore. “Competing alternatives mandate many clustered servers and require add-on flash cards to achieve a fraction of what DataCore delivers.”

DataCore virtual SANs are ideal solutions for clustered servers, VDI desktop deployments, remote disaster recovery and multi-site virtual server projects, as well as those demanding database and business application workloads running on server platforms. The software enables companies to create large scale and modular ‘Google-like’ infrastructures that leverage heterogeneous and commodity storage, servers and low-cost networking to transform them into enterprise-grade production architectures.

Virtual SANs and Flash: Comprehensive Software Stack is a ‘Must Have’ for Any Flash Deployment

SANsymphony-V10 delivers the industry’s most comprehensive set of features and services to manage, integrate and optimize Flash-based technology as part of a virtual SAN deployment or within an overall storage infrastructure. For example, SANsymphony-V10 self-tunes Flash, minimizes Flash wear, and enables Flash to be mirrored for high availability, even to non-Flash devices for cost reduction. The software employs adaptive in-memory caching technologies to speed up application workloads and to optimize write-traffic performance, complementing Flash read performance. DataCore’s powerful auto-tiering feature works across different vendor platforms, optimizing the use of new and existing investments in Flash and disk devices (up to 15 tiers). Other features such as metro-wide mirroring, snapshots and auto-recovery apply equally well to a mix of Flash and disk devices, enabling greater productivity, flexibility and cost-efficiency.
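The core idea behind auto-tiering – keep the hottest data on the fastest, most expensive tier – can be sketched as a simple greedy placement. The tier names, capacities and heat metric below are assumptions for illustration, not DataCore's algorithm:

```python
# Illustrative auto-tiering sketch: sort blocks by access frequency ("heat")
# and fill tiers fastest-first. SANsymphony-V10 supports up to 15 tiers
# across mixed vendor hardware; three hypothetical tiers are shown here.

def assign_tiers(blocks, tiers):
    """blocks: {block_id: access_count}; tiers: list of (name, capacity_in_blocks)
    ordered fastest first. Returns {block_id: tier_name}."""
    placement = {}
    hot_first = sorted(blocks, key=blocks.get, reverse=True)
    start = 0
    for name, capacity in tiers:
        for block in hot_first[start:start + capacity]:
            placement[block] = name
        start += capacity
    return placement

blocks = {"a": 900, "b": 15, "c": 420, "d": 3}
tiers = [("flash", 1), ("sas", 2), ("sata", 10)]
print(assign_tiers(blocks, tiers))
# hottest block "a" lands on flash; "c" and "b" on sas; coldest "d" on sata
```

In a real system placement would be re-evaluated continuously as access patterns shift, which is what makes the feature "auto".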

DataCore’s Universal End-to-End Services Platform Unifies ‘Isolated Storage Islands’

SANsymphony-V10 also continues to advance larger scale storage infrastructure management capabilities, cross-device automation and the capability to unify and federate ‘isolated storage islands.’

“It’s easy to see how IT organizations responding to specific projects could find themselves with several disjointed software stacks – one for virtual SANs for each server hypervisor and another set of stacks from each of their flash suppliers, which further complicates the handful of embedded stacks in each of their SAN arrays,” said IDC’s consulting director for storage, Nick Sundby. “DataCore treats each of these scenarios as use cases under its one, unifying software-defined storage platform, aiming to drive management and functional convergence across the enterprise.”

Additional Highlighted Features

The spotlight in SANsymphony-V10 is clearly on the new virtual SAN capabilities and the new licensing and pricing choices. However, this version also includes a number of other major performance and scalability enhancements:

  • Scalability doubled from 16 to 32 nodes, enabling metro-wide N+1 grid data protection
  • Support for high-speed 40/56 GigE iSCSI, 16 Gbps Fibre Channel and iSCSI target NIC teaming
  • Performance visualization/heat-map tools that add insight into the behavior of Flash and disks
  • New auto-tiering settings that optimize expensive resources (e.g., Flash cards) in a pool
  • Intelligent disk rebalancing that dynamically redistributes load across available devices within a tier
  • Automated CPU load leveling and Flash optimizations to increase performance
  • Disk-pool optimization and self-healing storage: disk contents are automatically restored across the remaining storage in the pool, with enhancements to easily select and prioritize the order of recovery
  • New self-tuning caching algorithms and optimizations for Flash cards and SSDs
  • ‘Click-simple’ configuration wizards to rapidly set up different use cases (virtual SAN, high-availability SANs, NAS file shares, etc.)
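The intra-tier disk rebalancing mentioned in the list above can be pictured as evening out load across a tier's devices. The greedy strategy below is a hypothetical illustration, not DataCore's algorithm:

```python
# Sketch of rebalancing within one tier: repeatedly shift one unit of load
# from the busiest device to the least busy one until the tier is level.

def rebalance(loads, max_moves=1000):
    """loads: {device: current_load}. Returns a leveled copy of the map."""
    loads = dict(loads)
    for _ in range(max_moves):
        hot = max(loads, key=loads.get)
        cold = min(loads, key=loads.get)
        if loads[hot] - loads[cold] <= 1:
            break                      # already as balanced as possible
        loads[hot] -= 1
        loads[cold] += 1
    return loads

print(rebalance({"disk0": 90, "disk1": 10, "disk2": 20}))
```

A production implementation would of course move data extents rather than abstract load units, and would throttle moves to avoid disturbing foreground I/O.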

Pricing and Availability

Typical multi-node SANsymphony-V10 software licenses start in the $10,000 to $25,000 range. The new Virtual SAN pricing starts at $4,000 per server. The virtual SAN price includes auto-tiering, adaptive read/write caching from DRAM, storage pooling, metro-wide synchronous mirroring, thin provisioning and snapshots. The software supports all popular operating systems hosted in VMware ESX and Microsoft Hyper-V environments. Simple plug-ins for both VMware vSphere and Microsoft System Center are included to enable simplified hypervisor-based administration. SANsymphony-V10 and its virtual SAN variations may be deployed in a virtual machine or run natively on Windows Server 2012, using standard physical x86-64 servers.

General availability for SANsymphony-V10 is scheduled for May 30, 2014.

About DataCore

DataCore is a leader in software-defined storage. The company’s storage virtualization software empowers organizations to seamlessly manage and scale their data storage architectures, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware vendors. Backed by 10,000 customer sites around the world, DataCore’s adaptive, self-learning and self-healing technology takes the pain out of manual processes and helps deliver on the promise of the software-defined data center through its hardware-agnostic architecture. Visit http://www.datacore.com or call (877) 780-5111 for more information.

DataCore, the DataCore logo and SANsymphony are trademarks or registered trademarks of DataCore Software Corporation. Other DataCore product or service names or logos referenced herein are trademarks of DataCore Software Corporation. All other products, services and company names mentioned herein may be trademarks of their respective owners.

More Stories By Business Wire

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
