Arizona State University’s Biodesign Institute Leverages DataCore’s Storage Virtualization Platform to Fuel Biomedical Research

DataCore, a leader in software-defined storage, today announced that the Biodesign Institute at Arizona State University (ASU), the second largest metropolitan university in the United States, has successfully adopted a software-defined storage architecture by implementing DataCore’s SANsymphony-V storage virtualization platform. With the help of DataCore, the Biodesign Institute has increased application performance by up to five times, maximized the capacity of its existing storage investments, improved uptime and significantly reduced costs by removing old storage controllers that had become high-maintenance and unaffordable.

“DataCore enables all the different storage devices that comprise our architecture to communicate and work with each other – even though they come from a wide mixture of vendors – thereby allowing the Institute to gain efficiencies and reduce our costs,” said Scott LeComte, senior IT manager at Arizona State University’s Biodesign Institute. “Just as important is the fact that DataCore’s software is portable and can reside in different locations, meaning we avoid a single point of failure by deploying two DataCore-powered nodes that operate synchronously campus-wide and can automatically take over for each other in the event of a failure.”
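
For readers less familiar with the two-node design LeComte describes, the sketch below (Python, purely conceptual) illustrates the general idea behind synchronous mirroring with automatic failover: every write is committed to both nodes before it is acknowledged, so either node can serve the data if the other fails. The class and method names are hypothetical and do not represent DataCore’s software or its APIs.

# Conceptual sketch of synchronous mirroring with automatic failover between two
# storage nodes. All names here are hypothetical, not DataCore APIs.

class Node:
    def __init__(self, name):
        self.name = name
        self.up = True
        self.blocks = {}

    def is_up(self):
        return self.up

    def commit(self, block_id, data):
        self.blocks[block_id] = data

    def fetch(self, block_id):
        return self.blocks[block_id]

class MirroredVolume:
    """Writes are committed to every healthy node before being acknowledged;
    reads are served by whichever node is still up."""

    def __init__(self, nodes):
        self.nodes = list(nodes)

    def write(self, block_id, data):
        healthy = [n for n in self.nodes if n.is_up()]
        if not healthy:
            raise RuntimeError("no storage nodes available")
        for node in healthy:              # synchronous mirror: update all healthy copies
            node.commit(block_id, data)

    def read(self, block_id):
        for node in self.nodes:           # automatic failover: first healthy node answers
            if node.is_up():
                return node.fetch(block_id)
        raise RuntimeError("no storage nodes available")

if __name__ == "__main__":
    a, b = Node("campus-node-a"), Node("campus-node-b")
    volume = MirroredVolume([a, b])
    volume.write("lab-results-001", b"experiment data")
    a.up = False                          # simulate losing one node
    print(volume.read("lab-results-001")) # the surviving mirror still serves the data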

The Biodesign Institute dedicates its research to addressing today’s critical global challenges in healthcare, sustainability and security. The Institute comprises 14 research centers, accounts for almost $60 million in annual research expenditures and has more than 500 personnel who rely on the DataCore-powered storage infrastructure. Because the Institute is so research-intensive, its researchers are required to retain large amounts of experimental data for long periods of time; they are frequently called upon to “prove” how they achieved a particular discovery, whether by a federal agency such as the Food and Drug Administration (FDA) or by a third-party company seeking to buy the research outright.

“When we first got DataCore, it was amazing how easily it fit into the environment and just worked. We are very pleased with just how seamless and non-disruptive the solution has been and its flexibility proves itself time and time again,” added LeComte.

By using DataCore, the Institute’s IT team can now readily pool all storage capacity and provide centralized oversight, provisioning and management across its entire storage architecture. The Biodesign Institute expanded its IT environment while cutting costs: the total cost of supporting one terabyte of storage dropped from $2,500 to $1,000. SANsymphony-V currently manages more than 300 TB of total mirrored capacity.
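
As a quick back-of-the-envelope estimate of what those figures imply, the short Python sketch below assumes the quoted per-terabyte support cost applies uniformly across the roughly 300 TB of mirrored capacity, which the release does not state explicitly; it is illustrative arithmetic only.

# Illustrative savings estimate; assumes the quoted per-TB support cost applies
# uniformly across the Institute's mirrored capacity (an assumption, not stated).
old_cost_per_tb = 2500      # USD per TB before SANsymphony-V
new_cost_per_tb = 1000      # USD per TB after SANsymphony-V
mirrored_capacity_tb = 300  # approximate mirrored capacity under management

estimated_savings = (old_cost_per_tb - new_cost_per_tb) * mirrored_capacity_tb
print(f"Estimated support-cost savings: ${estimated_savings:,}")  # $450,000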

“IT departments, especially at state-run institutions such as Arizona State, are constantly being tasked to find new ways to reduce costs while ensuring that IT operations run smoothly,” said Steve Houck, COO at DataCore. “By creating a software-defined storage architecture based on SANsymphony-V, the Biodesign Institute at Arizona State not only significantly reduced costs but also increased application performance by up to five times.”

A complete case study discussing the DataCore deployment at the Biodesign Institute at Arizona State University is available here: http://datacore.com/testimonials/biodesign-institute-at-asu

About The Biodesign Institute

The Biodesign Institute at ASU addresses today’s critical global challenges in healthcare, sustainability and security by developing solutions inspired by natural systems and translating those solutions into commercially viable products and clinical practices.

About DataCore Software

DataCore is a leader in software-defined storage. The company’s storage virtualization software empowers organizations to seamlessly manage and scale their data storage architectures, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware vendors. Backed by 10,000 customer sites around the world, DataCore’s adaptive, self-learning and self-healing technology takes the pain out of manual processes and helps deliver on the promise of the new software-defined data center through its hardware-agnostic architecture. Visit http://www.datacore.com or call (877) 780-5111 for more information.

DataCore, the DataCore logo and SANsymphony are trademarks or registered trademarks of DataCore Software Corporation. Other DataCore product or service names or logos referenced herein are trademarks of DataCore Software Corporation. All other products, services and company names mentioned herein may be trademarks of their respective owners.

