Research Institutions Push the Boundaries of Supercomputing with Dell

At SC13, Dell reaffirmed its long-standing commitment to improving access to and use of high-performance computing (HPC) in research computing. Over the last five years, Dell and its research computing partners have combined integrated server, storage and networking solutions designed specifically for hyperscale and research computing environments with scalable, cost-effective usage models such as HPC-as-a-Service and HPC in the cloud. Together, these offerings simplify collaborative science, improve access to compute capacity and accelerate discovery for the research computing community.

Earlier this year, Dell took its commitment a step further, introducing Active Infrastructure for HPC Life Sciences, a converged solution designed specifically for genomics analysis—a very specialized and rapidly growing area of research computing. The new solution integrates computing, storage and networking to reduce lengthy implementation timelines and process up to 37 genomes per day and 259 genomes per week.

Oak Ridge National Laboratory, the University of California, San Diego, The University of Texas at Austin, the University of Florida, Clemson University, the University of Wisconsin-Madison and Stanford University are a few of the hundreds of organizations using Dell’s HPC solutions today to harness the power of data for discovery.

Oak Ridge National Laboratory Supercomputer Achieves I/O Rate of More Than One Terabyte Per Second

To boost the productivity of its Titan supercomputer—the fastest computer in America dedicated solely to scientific research—and better support its 1,200 users and more than 150 research projects, the Oak Ridge National Laboratory (ORNL) Leadership Computing Facility needed a file system with high-speed interconnects to match the supercomputer’s peak theoretical performance of 27 petaflops, or 27,000 trillion calculations per second. Working with Dell and other technology partners, ORNL upgraded its Lustre-based file system, “Spider,” to Spider II, quadrupling the file system’s size and speed. It also upgraded the interconnects between Titan and Spider to a new InfiniBand fourteen data rate (FDR) network that is seven times faster and supports an I/O rate in excess of one terabyte per second.

The University of California, San Diego to Deploy XSEDE’s First Virtualized HPC Cluster, Comet

The San Diego Supercomputer Center (SDSC) at the University of California, San Diego is deploying Comet, a new virtualized petascale supercomputer designed to fulfill pent-up demand for computing in fields such as the social sciences and genomics, where a broader set of researchers needs growing compute capacity. Funded by a $12 million NSF grant and scheduled to start operations in early 2015, Comet will be a Dell-based cluster featuring next-generation Intel Xeon processors. With peak performance of nearly two petaflops, Comet will be the first XSEDE production system to support high-performance virtualization, and it is uniquely designed to support many modest-scale jobs: each node will be equipped with two processors, 128 gigabytes (GB) of traditional DRAM and 320 GB of flash memory. Comet will also include some large-scale nodes as well as nodes with NVIDIA GPUs to support visualization, molecular dynamics simulations and genome assembly.

“Comet is all about HPC for the 99 percent,” said SDSC Director Michael Norman, Comet principal investigator. “As the world’s first virtualized HPC cluster, Comet is designed to deliver a significantly increased level of computing capacity and customizability to support data-enabled science and engineering at the campus, regional and national levels.”

The University of Texas at Austin to Deploy Wrangler, An Innovative New Data System

The Texas Advanced Computing Center (TACC) at The University of Texas at Austin recently announced plans to build Wrangler, a groundbreaking data analysis and management system for the national open science community, funded by a $6 million National Science Foundation (NSF) grant. Featuring 20 petabytes of storage on the Dell C8000 platform and using PowerEdge R620 and R720 compute nodes, Wrangler is designed for high-performance access to community data sets. When completed in January 2015, it will support the popular MapReduce software framework and a full ecosystem of analytics for Big Data. Wrangler will integrate with TACC’s Stampede supercomputer and will be extended through TACC to NSF Extreme Science and Engineering Discovery Environment (XSEDE) resources around the country.

“Wrangler is designed from the ground up for emerging and existing applications in data-intensive science,” said Dan Stanzione, Wrangler’s lead principal investigator and TACC deputy director. “Wrangler will be one of the largest secure, replicated storage options for the national open science community.”

Dell at SC13

Hear from experts from the University of Florida, Clemson University, the University of North Texas, the University of Wisconsin-Madison and Stanford University about how they are harnessing the power of data for discovery at the “Solving the HPC Data Deluge” session on Nov. 20, 1:30-2:30 p.m. at Dell Booth #1301. And learn about HPC virtualization from the University of California, San Francisco, Florida State University, Cambridge University, Oklahoma University and Australian National University from 3-4 p.m. For more information on Dell’s presence at SC13 visit this blog, and follow the conversation at HPCatDell.

Dell World

Join us at Dell World 2013, Dell’s premier customer event exploring how technology solutions and services are driving business innovation. Learn more at www.dellworld.com, attend our virtual Dell World: Live Online event or follow #DellWorld on Twitter.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
