
Over 20 Academic Institutions Now Use Terascala High-Performance Computing Solutions

Terascala, the Fast Data company, today announced that it has deployed its high-performance storage solutions at over 20 universities across the United States. Terascala solutions are propelling some of the most diverse and computationally intensive research taking place on campuses today, from assessing the effects of the Gulf of Mexico oil spill, to identifying the genetic roots of diseases, to modeling the astrophysical flows and cosmic structures that explain how new stars form.

Terascala provides the operating system that transforms block storage and controllers from Dell, EMC, and NetApp into an easy-to-manage storage appliance. By abstracting away the complexity of the individual components, Terascala-powered appliances deliver reliable, predictable throughput at multiple gigabytes per second, reducing run times to hours instead of days or weeks.
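The relationship between sustained throughput and run time is simple arithmetic. The sketch below is illustrative only; the throughput figures are hypothetical examples, not Terascala benchmarks, though the 300 TB dataset size matches the capacity of the Colonial One appliance described later in this release.

```python
# Illustrative only: how aggregate storage throughput changes the time
# needed to stream a large dataset. Throughput figures are hypothetical,
# not vendor benchmarks.

def read_time_hours(dataset_tb: float, throughput_gb_s: float) -> float:
    """Hours to stream `dataset_tb` terabytes at a sustained
    `throughput_gb_s` gigabytes per second (1 TB = 1000 GB)."""
    seconds = dataset_tb * 1000 / throughput_gb_s
    return seconds / 3600

# A 300 TB dataset at three hypothetical sustained rates:
for gb_s in (0.1, 1.0, 10.0):
    print(f"{gb_s:>5} GB/s -> {read_time_hours(300, gb_s):,.1f} hours")
```

At 0.1 GB/s the job takes roughly 833 hours (weeks); at 10 GB/s it drops to about 8 hours, which is the "hours instead of days or weeks" effect described above.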

Computational research presents a very challenging environment for parallel file system storage because of the diversity of applications that exist across the academic community. High-performance computing centers typically serve as a central resource for the entire campus, supporting applications in computational chemistry and biology, genomic sequencing analysis, computational fluid dynamics, finite element analysis, seismic processing, and financial modeling.

Two of the most recent clusters to come online are the George Washington University Colonial One and the University of Florida HiPerGator. GW’s Colonial One cluster was launched on June 25, 2013, to support all of the University’s computational research needs. Colonial One includes a 300 terabyte Dell|Terascala HSS 4.5 storage appliance (DT-HSS).

The University of Florida’s HiPerGator has a peak speed of 150 trillion calculations per second, and includes a 2.88 petabyte Dell|Terascala HSS 4.5 storage appliance. “In the coming months, some 500 researchers and 160 teams with more than $175 million in research funding will be using HiPerGator to run their experiments,” said Dr. David Norton, Vice President for Research at the University of Florida. UF recently received an $8 million federal award from the National Nuclear Security Administration, along with the designation as a center of excellence, as a result of its high-performance computing capabilities.

“Having over 20 universities using our storage solutions is a significant accomplishment for Terascala,” said Steve Butler, CEO. “The fact that Terascala-powered storage is doing so well in these tough environments is a very strong indication of the robustness of the solution to perform well, as well as its ability to be easily tuned for very diverse workloads.”


Dell|Terascala HSS 4.5 Product Information:

Terascala Case Studies and White Papers:

UF Video - HiPerGator Open for Business:

GW Video - Colonial One Launch Event:

About Terascala

Terascala is the fast data company. Terascala storage appliances dramatically accelerate the time to insight for organizations that rely on simulation, analysis, and modeling tools to bring new products and innovation to market. Exclusively available through strategic partners Dell, EMC, and NetApp, Terascala storage appliances provide on-demand throughput at multiple gigabytes per second while leveraging industry-leading storage platforms for long-term data protection. Learn more at

Connect with Terascala

Read our blog:
Follow us on Twitter:

Terascala and the Terascala logo are trademarks of Terascala, Inc. All other brands, products, or service names may be trademarks or property of their respective holders.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
