Arizona State University’s Biodesign Institute Leverages DataCore’s Storage Virtualization Platform to Fuel Biomedical Research

DataCore, a leader in software-defined storage, today announced that the Biodesign Institute at Arizona State University (ASU), the second largest metropolitan university in the United States, has successfully adopted a software-defined storage architecture by implementing DataCore’s SANsymphony-V storage virtualization platform. With the help of DataCore, the Biodesign Institute has increased application performance by up to five times, maximized the capacity of its existing storage investments, improved uptime and significantly reduced costs by retiring aging storage controllers whose maintenance had become unaffordable.

“DataCore enables all the different storage devices that comprise our architecture to communicate and work with each other – even though they come from a wide mixture of vendors – thereby allowing the Institute to gain efficiencies and reduce our costs,” said Scott LeComte, senior IT manager at Arizona State University’s Biodesign Institute. “Just as important is the fact that DataCore’s software is portable and can reside in different locations, meaning we avoid a single point of failure by deploying two DataCore-powered nodes that operate synchronously campus-wide and can automatically take over for each other in the event of a failure.”
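LeComte’s description maps to a classic synchronous-mirroring pattern: a write is acknowledged only once both nodes have persisted it, and reads fail over to whichever node survives. The sketch below is a hypothetical illustration of that pattern only – it uses no DataCore API, and the classes and node names are invented for the example:

    class StorageNode:
        """One of two campus nodes, each holding a full copy of every block."""

        def __init__(self, name):
            self.name = name
            self.blocks = {}
            self.healthy = True

        def write(self, block_id, data):
            if not self.healthy:
                raise IOError(f"{self.name} is down")
            self.blocks[block_id] = data

        def read(self, block_id):
            if not self.healthy:
                raise IOError(f"{self.name} is down")
            return self.blocks[block_id]

    class MirroredVolume:
        """Synchronous mirror: acknowledge a write only after BOTH nodes
        persist it; reads automatically fail over to a surviving node."""

        def __init__(self, node_a, node_b):
            self.nodes = [node_a, node_b]

        def write(self, block_id, data):
            for node in self.nodes:      # write both copies before acking
                node.write(block_id, data)
            return "ack"

        def read(self, block_id):
            for node in self.nodes:      # failover: first healthy node wins
                if node.healthy:
                    return node.read(block_id)
            raise IOError("no healthy node remains")

    # Hypothetical usage: mirrored write, node failure, transparent failover.
    volume = MirroredVolume(StorageNode("campus-east"), StorageNode("campus-west"))
    volume.write("blk-1", b"assay results")
    volume.nodes[0].healthy = False                  # simulate a node failure
    assert volume.read("blk-1") == b"assay results"  # served by the survivor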

The Biodesign Institute dedicates its research to addressing today’s critical global challenges in healthcare, sustainability and security. The Institute comprises 14 research centers, accounts for almost $60 million in annual research expenditures, and employs more than 500 personnel who rely on the DataCore-powered storage infrastructure. Because the Institute is so research-intensive, researchers must retain the large volumes of data their experiments generate for long periods, since they are frequently called upon to “prove” how they achieved a particular discovery – whether by a federal agency such as the Food and Drug Administration (FDA), or by a third-party company seeking to buy the research outright.

“When we first got DataCore, it was amazing how easily it fit into the environment and just worked. We are very pleased with just how seamless and non-disruptive the solution has been, and its flexibility proves itself time and time again,” added LeComte.

By using DataCore, the Institute’s IT team can now easily pool all storage capacity and apply centralized oversight, provisioning and management across its entire storage architecture. The Institute expanded its IT environment while cutting costs at the same time: the total cost of supporting storage fell from $2,500 per TB to $1,000 per TB. SANsymphony-V currently manages more than 300 TB of total mirrored capacity.
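Taking those figures at face value – and assuming, since the release does not say, that the per-TB support cost applies uniformly across the mirrored capacity – a back-of-the-envelope check of the savings looks like this:

    # Rough check of the figures quoted above. Assumes the per-TB support
    # cost applies uniformly to all 300 TB of mirrored capacity; the release
    # does not state whether these are annual or total costs.
    mirrored_tb = 300            # capacity managed by SANsymphony-V
    cost_before = 2_500          # $ per TB before storage virtualization
    cost_after = 1_000           # $ per TB after

    savings = mirrored_tb * (cost_before - cost_after)
    print(f"Support-cost savings: ${savings:,}")   # -> $450,000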

“IT departments, especially at state-run institutions such as Arizona State, are constantly being tasked to find new ways to reduce cost while ensuring that IT operations run smoothly,” said Steve Houck, COO at DataCore. “By creating a software-defined storage architecture based on SANsymphony-V, the Biodesign Institute at Arizona State not only significantly reduced costs but also increased application performance by up to five times.”

A complete case study discussing the DataCore deployment at the Biodesign Institute at Arizona State University is available here: http://datacore.com/testimonials/biodesign-institute-at-asu

About The Biodesign Institute

The Biodesign Institute at ASU addresses today’s critical global challenges in healthcare, sustainability and security by developing solutions inspired by natural systems and translating those solutions into commercially viable products and clinical practices.

About DataCore Software

DataCore is a leader in software-defined storage. The company’s storage virtualization software empowers organizations to seamlessly manage and scale their data storage architectures, delivering massive performance gains at a fraction of the cost of solutions offered by legacy storage hardware vendors. Deployed at 10,000 customer sites around the world, DataCore’s adaptive, self-learning and self-healing technology takes the pain out of manual processes and helps deliver on the promise of the software-defined data center through its hardware-agnostic architecture. Visit http://www.datacore.com or call (877) 780-5111 for more information.

DataCore, the DataCore logo and SANsymphony are trademarks or registered trademarks of DataCore Software Corporation. Other DataCore product or service names or logos referenced herein are trademarks of DataCore Software Corporation. All other products, services and company names mentioned herein may be trademarks of their respective owners.
