DataCore Announces Enterprise-Class Virtual SANs and Flash-Optimizing Stack in its Next Generation SANsymphony-V10 Software-Defined Storage Platform

Amidst pent-up demand for enterprise-grade virtual SANs and the need for cost-effective utilization of Flash technology, DataCore, a leader in software-defined storage, today revealed new virtual SAN functionality and significant enhancements to its SANsymphony™-V10 software – the 10th-generation release of its comprehensive storage services platform. The new release significantly advances virtual SAN capabilities designed to achieve the fastest performance, highest availability and optimal use of Flash and disk storage directly attached to application hosts and clustered servers in virtual (server-side) SAN use cases.

DataCore’s new Virtual SAN is a software-only solution that automates and simplifies storage management and provisioning while delivering enterprise-class functionality, automated recovery and significantly faster performance. It is easy to set up and runs on new or existing x86 servers, where it creates a shared storage pool out of the internal Flash and disk storage resources available to those servers. This means the DataCore™ Virtual SAN can be cost-effectively deployed as an overlay, without the need to make major investments in new hardware or complex SAN gear.
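The release does not describe DataCore's internal interfaces, but the core idea of pooling each server's internal drives into one shared resource can be illustrated with a short, purely hypothetical sketch; every class and field name below is invented for illustration and is not DataCore code.

```python
# Illustrative sketch: how a server-side virtual SAN might aggregate each
# node's internal flash and disk into a single shared pool (hypothetical).
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class LocalDevice:
    name: str          # e.g. "nvme0" or "sda"
    media: str         # "flash" or "disk"
    capacity_gb: int


@dataclass
class Node:
    hostname: str
    devices: list[LocalDevice]


@dataclass
class VirtualSanPool:
    """One logical pool built from the internal devices of many servers."""
    nodes: list[Node] = field(default_factory=list)

    def add_node(self, node: Node) -> None:
        self.nodes.append(node)

    def capacity_by_media(self) -> dict[str, int]:
        # Sum usable capacity across all nodes, grouped by media type.
        totals: dict[str, int] = {}
        for node in self.nodes:
            for dev in node.devices:
                totals[dev.media] = totals.get(dev.media, 0) + dev.capacity_gb
        return totals


# Two commodity servers contribute their internal drives to the shared pool.
pool = VirtualSanPool()
pool.add_node(Node("host-01", [LocalDevice("nvme0", "flash", 800),
                               LocalDevice("sda", "disk", 4000)]))
pool.add_node(Node("host-02", [LocalDevice("nvme0", "flash", 800),
                               LocalDevice("sdb", "disk", 4000)]))
print(pool.capacity_by_media())   # {'flash': 1600, 'disk': 8000}
```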

DataCore contrasts its enterprise-class virtual SAN offering with competing products, which it characterizes as:

  • Incapable of sustaining serious workloads and providing a growth path to physical SAN assets.
  • Inextricably tied to a specific server hypervisor, rendering them unusable in all but the smallest branch office environments or non-critical test and development scenarios.

The Ultimate Virtual SAN: Inexhaustible Performance, Continuous Availability, Large Scale

There is no compromise on performance, availability or scaling with DataCore. The new SANsymphony-V10 virtual SAN software scales performance to more than 50 million IOPS and capacity to 32 petabytes across a cluster of 32 servers, making it one of the most powerful and scalable systems in the marketplace.

Enterprise-class availability comes standard with a DataCore virtual SAN; the software includes automated failover and failback recovery, and is able to span an N+1 grid (up to 32 nodes) stretching over metro-wide distances. With a DataCore virtual SAN, business continuity, remote-site replication and data protection are simple to implement and, best of all, run automatically once set up.
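To make the failover and failback behavior concrete, here is a minimal, hypothetical sketch of synchronous mirroring between two nodes. It is not DataCore's implementation; it only illustrates the principle that a write is committed to both copies, so either node can serve the data after a failure and the recovered node is resynchronized on failback.

```python
# Hypothetical sketch of synchronous mirroring with automated failover
# between two nodes of an N+1 grid (not DataCore's implementation).
class MirroredVolume:
    def __init__(self) -> None:
        self.primary: dict[int, bytes] = {}    # blocks on the active node
        self.secondary: dict[int, bytes] = {}  # blocks on the mirror node
        self.primary_online = True

    def write(self, block_id: int, data: bytes) -> None:
        # A write is acknowledged only after both copies are persisted,
        # so either node can serve the data unchanged after a failure.
        if self.primary_online:
            self.primary[block_id] = data
        self.secondary[block_id] = data

    def read(self, block_id: int) -> bytes:
        # Automated failover: if the primary is down, serve from the mirror.
        source = self.primary if self.primary_online else self.secondary
        return source[block_id]

    def fail_primary(self) -> None:
        self.primary_online = False

    def failback(self) -> None:
        # Resynchronize the recovered node from the surviving copy,
        # then resume serving from it.
        self.primary.update(self.secondary)
        self.primary_online = True


vol = MirroredVolume()
vol.write(0, b"journal entry")
vol.fail_primary()
assert vol.read(0) == b"journal entry"   # still readable from the mirror
vol.failback()
```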

DataCore SANsymphony-V10 also handles mixed combinations of virtual and physical SANs and accounts for the likelihood that a virtual SAN will extend out into an external SAN, whether the need for centralized storage services and hardware-consolidation efficiencies arises initially or in later stages of a project. DataCore stands apart from the competition in that it can run on the server side as a virtual SAN, it can run and manage physical SANs, and it can operate and federate across both. SANsymphony-V10 essentially provides a comprehensive growth path that expands the scope of the virtual SAN to incorporate external storage non-disruptively as part of an overall architecture.

A Compelling Solution for Expanding Enterprises

While larger environments will be drawn by SANsymphony-V10’s impressive specs, many customers have relatively modest requirements for their first virtual SAN. Typically they are looking to cost-effectively deploy fast ‘in-memory’ technologies to speed up critical business applications, add resiliency and grow to integrate multiple systems over multiple sites, while living within limited commodity-equipment budgets.

“We enable clients to get started with a high-performance, stretchable and scalable virtual SAN at an appealing price that takes full advantage of inexpensive servers and their internal drives,” said Paul Murphy, vice president of worldwide marketing at DataCore. “Competing alternatives mandate many clustered servers and require add-on Flash cards to achieve a fraction of what DataCore delivers.”

DataCore virtual SANs are ideal for clustered servers, VDI deployments, remote disaster recovery and multi-site virtual server projects, as well as for demanding database and business application workloads running on server platforms. The software enables companies to create large-scale, modular ‘Google-like’ infrastructures that leverage heterogeneous and commodity storage, servers and low-cost networking, transforming them into enterprise-grade production architectures.

Virtual SANs and Flash: Comprehensive Software Stack is a ‘Must Have’ for Any Flash Deployment

SANsymphony-V10 delivers the industry’s most comprehensive set of features and services to manage, integrate and optimize Flash-based technology as part of a virtual SAN deployment or within an overall storage infrastructure. For example, SANsymphony-V10 self-tunes Flash devices and minimizes Flash wear, and it enables Flash to be mirrored for high availability, even to non-Flash devices, for cost reduction. The software employs adaptive ‘in-memory’ caching technologies to speed up application workloads, optimizing write performance to complement Flash read performance. DataCore’s powerful auto-tiering feature works across different vendor platforms, optimizing the use of new and existing investments in Flash and disk devices (up to 15 tiers). Other features such as metro-wide mirroring, snapshots and auto-recovery apply equally well to the mix of Flash and disk devices, enabling greater productivity, flexibility and cost-efficiency.
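The release does not describe DataCore's tiering algorithm, but the general idea of auto-tiering can be conveyed with a simple frequency-based placement pass, sketched below with invented names: the hottest blocks are kept on the fastest (Flash) tier and colder blocks migrate down.

```python
# Hypothetical sketch of a frequency-based auto-tiering pass
# (not DataCore's algorithm; names and structure are illustrative).
from collections import Counter


def plan_tier_placement(access_counts: Counter,
                        tier_capacities: list[int]) -> list[list[int]]:
    """Assign block ids to tiers (tier 0 = fastest, e.g. Flash) by access heat.

    access_counts  : block_id -> number of recent reads/writes
    tier_capacities: how many blocks each tier can hold, fastest first
    """
    hottest_first = [blk for blk, _ in access_counts.most_common()]
    placement: list[list[int]] = []
    cursor = 0
    for capacity in tier_capacities:
        placement.append(hottest_first[cursor:cursor + capacity])
        cursor += capacity
    return placement


# Three tiers (Flash, fast disk, nearline) and a handful of blocks.
heat = Counter({1: 900, 2: 850, 3: 40, 4: 35, 5: 2, 6: 1})
print(plan_tier_placement(heat, [2, 2, 2]))
# [[1, 2], [3, 4], [5, 6]]  -> the hottest blocks land on Flash
```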

DataCore’s Universal End-to-End Services Platform Unifies ‘Isolated Storage Islands’

SANsymphony-V10 also continues to advance larger-scale storage infrastructure management, cross-device automation and the ability to unify and federate ‘isolated storage islands.’

“It’s easy to see how IT organizations responding to specific projects could find themselves with several disjointed software stacks – one for virtual SANs for each server hypervisor and another set of stacks from each of their flash suppliers, which further complicates the handful of embedded stacks in each of their SAN arrays,” said IDC’s consulting director for storage, Nick Sundby. “DataCore treats each of these scenarios as use cases under its one, unifying software-defined storage platform, aiming to drive management and functional convergence across the enterprise.”

Additional Highlighted Features

The spotlight in SANsymphony-V10 is clearly on the new virtual SAN capabilities and the new licensing and pricing choices. However, a number of other major performance and scalability enhancements appear in this version as well:

  • Scalability has doubled from 16 to 32 nodes, enabling metro-wide N+1 grid data protection
  • Support for high-speed 40/56 GigE iSCSI, 16 Gbps Fibre Channel and iSCSI target NIC teaming
  • Performance visualization/heat map tools add insight into the behavior of Flash and disks
  • New auto-tiering settings optimize expensive resources (e.g., Flash cards) in a pool
  • Intelligent disk rebalancing dynamically redistributes load across available devices within a tier (see the sketch after this list)
  • Automated CPU load leveling and Flash optimizations to increase performance
  • Disk pool optimization and self-healing storage: disk contents are automatically restored across the remaining storage in the pool, with enhancements to easily select and prioritize the order of recovery
  • New self-tuning caching algorithms and optimizations for Flash cards and SSDs
  • ‘Click-simple’ configuration wizards to rapidly set up different use cases (virtual SAN, high-availability SANs, NAS file shares, etc.)
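As referenced in the disk-rebalancing item above, the sketch below illustrates one way load could be redistributed across the devices of a single tier after a new disk is added. It uses invented names and a simple greedy plan; it is not DataCore's rebalancer.

```python
# Hypothetical sketch: greedy redistribution of allocated capacity across
# the devices of a single tier (illustrative only, not DataCore's code).
def rebalance(device_load_gb: dict[str, float]) -> list[tuple[str, str, float]]:
    """Return (source, destination, gigabytes) moves that even out the tier."""
    target = sum(device_load_gb.values()) / len(device_load_gb)
    donors = {d: load - target for d, load in device_load_gb.items() if load > target}
    takers = {d: target - load for d, load in device_load_gb.items() if load < target}
    moves: list[tuple[str, str, float]] = []
    for src, surplus in donors.items():
        for dst in list(takers):
            if surplus <= 0:
                break
            chunk = min(surplus, takers[dst])       # move only what both sides allow
            moves.append((src, dst, round(chunk, 1)))
            surplus -= chunk
            takers[dst] -= chunk
            if takers[dst] <= 0:
                del takers[dst]
    return moves


# A newly added disk ("sdd") starts empty and receives data from the most
# loaded devices until every device in the tier carries a similar share.
print(rebalance({"sdb": 900.0, "sdc": 700.0, "sdd": 0.0}))
# [('sdb', 'sdd', 366.7), ('sdc', 'sdd', 166.7)]
```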

Pricing and Availability

Typical multi-node SANsymphony-V10 software licenses start in the $10,000 to $25,000 range. The new Virtual SAN pricing starts at $4,000 per server. The virtual SAN price includes auto-tiering, adaptive read/write caching from DRAM, storage pooling, metro-wide synchronous mirroring, thin provisioning and snapshots. The software supports all the popular operating systems hosted on VMware ESX and Microsoft Hyper-V environments. Simple plug-ins for both VMware vSphere and Microsoft System Center are included to enable simplified hypervisor-based administration. SANsymphony-V10 and its virtual SAN variations may be deployed in a virtual machine or run natively on Windows Server 2012, using standard physical x86-64 servers.

General availability for SANsymphony-V10 is scheduled for May 30, 2014.

About DataCore

DataCore is a leader in software-defined storage. The company’s storage virtualization software empowers organizations to seamlessly manage and scale their data storage architectures, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware vendors. Backed by 10,000 customer sites around the world, DataCore’s adaptive, self-learning and self-healing technology takes the pain out of manual processes and helps deliver on the promise of the software-defined data center through its hardware-agnostic architecture. Visit http://www.datacore.com or call (877) 780-5111 for more information.

DataCore, the DataCore logo and SANsymphony are trademarks or registered trademarks of DataCore Software Corporation. Other DataCore product or service names or logos referenced herein are trademarks of DataCore Software Corporation. All other products, services and company names mentioned herein may be trademarks of their respective owners.
