Research Institutions Push the Boundaries of Supercomputing with Dell

At SC13, Dell reaffirmed its long-standing commitment to improving access to and use of high-performance computing (HPC) in research computing. Over the last five years, Dell and its research computing partners have combined integrated server, storage and networking solutions designed specifically for hyperscale and research computing environments with scalable, cost-effective usage models such as HPC-as-a-Service and HPC in the cloud. Together, these offerings simplify collaborative science, improve access to compute capacity and accelerate discovery for the research computing community.

Earlier this year, Dell took its commitment a step further, introducing Active Infrastructure for HPC Life Sciences, a converged solution designed specifically for genomics analysis, a highly specialized and rapidly growing area of research computing. The new solution integrates computing, storage and networking to shorten lengthy implementation timelines, and it can process up to 37 genomes per day, or 259 genomes per week.
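
As a quick sanity check on those throughput figures, here is a minimal arithmetic sketch; the per-genome time is derived from the article's own numbers, not a separately published specification:

```python
# Sanity check on the quoted genomics throughput figures.
GENOMES_PER_DAY = 37  # from the announcement

minutes_per_genome = 24 * 60 / GENOMES_PER_DAY
print(f"{GENOMES_PER_DAY * 7} genomes/week; about {minutes_per_genome:.0f} minutes per genome")
# 259 genomes/week; about 39 minutes per genome
```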

Oak Ridge National Laboratory, University of California at San Diego, The University of Texas at Austin, University of Florida, Clemson University, University of Wisconsin at Madison and Stanford University are a few of the hundreds of organizations utilizing Dell’s HPC solutions today to harness the power of data for discovery.

Oak Ridge National Laboratory Supercomputer Achieves I/O Rate of More Than One Terabyte Per Second

To boost the productivity of its Titan supercomputer—the fastest computer in America dedicated solely to scientific research—and better support its 1,200 users and more than 150 research projects, the Oak Ridge National Laboratory (ORNL) Leadership Computing Facility needed a file system with high-speed interconnects that could match the supercomputer’s peak theoretical performance of 27 petaflops, or 27,000 trillion calculations per second. Working with Dell and other technology partners, ORNL upgraded its Lustre-based file system “Spider” to Spider II, quadrupling the file system’s size and speed. It also upgraded the interconnects between Titan and Spider to a new InfiniBand fourteen data rate (FDR) network that is designed to be seven times faster and to support an I/O rate in excess of one terabyte per second.
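
For a sense of scale, here is a back-of-the-envelope sketch of what a one-terabyte-per-second file system means in practice; the checkpoint size below is a hypothetical figure for illustration, not a published Titan specification:

```python
# Back-of-the-envelope: what a 1 TB/s aggregate I/O rate buys a simulation
# that periodically writes its state (a checkpoint) to the file system.
IO_RATE_BPS = 1e12        # 1 terabyte per second, from the article

checkpoint_tb = 100       # hypothetical checkpoint size in TB (illustrative only)
write_seconds = checkpoint_tb * 1e12 / IO_RATE_BPS
print(f"A {checkpoint_tb} TB checkpoint drains in about {write_seconds:.0f} seconds at 1 TB/s")
# A 100 TB checkpoint drains in about 100 seconds at 1 TB/s
```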

The University of California, San Diego Deploying XSEDE’s First Virtualized HPC Cluster with Comet

The San Diego Supercomputer Center (SDSC) at the University of California, San Diego is deploying Comet, a new virtualized petascale supercomputer designed to fulfill pent-up demand for computing in areas such as the social sciences and genomics, where a broader set of researchers needs growing amounts of computing capacity. Funded by a $12 million NSF grant and scheduled to start operations in early 2015, Comet will be a Dell-based cluster featuring next-generation Intel Xeon processors. With a peak performance of nearly two petaflops, Comet will be the first XSEDE production system to support high-performance virtualization, and it is uniquely designed to support many modest-scale jobs: each node will be equipped with two processors, 128 gigabytes (GB) of traditional DRAM and 320 GB of flash memory. Comet will also include some large-scale nodes, as well as nodes with NVIDIA GPUs to support visualization, molecular dynamics simulations or genome assembly.
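
Taking the quoted per-node configuration at face value, a short sketch of the aggregate capacity such a design implies; the node count below is hypothetical, and only the per-node figures come from the announcement:

```python
# Aggregate capacity implied by Comet's quoted per-node configuration.
DRAM_PER_NODE_GB = 128    # traditional DRAM per node (from the announcement)
FLASH_PER_NODE_GB = 320   # flash memory per node (from the announcement)

nodes = 1000              # hypothetical node count, for illustration only
print(f"{nodes} nodes -> {nodes * DRAM_PER_NODE_GB / 1024:.0f} TB DRAM, "
      f"{nodes * FLASH_PER_NODE_GB / 1024:.0f} TB flash")
# 1000 nodes -> 125 TB DRAM, 312 TB flash
```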

“Comet is all about HPC for the 99 percent,” said SDSC Director Michael Norman, Comet principal investigator. “As the world’s first virtualized HPC cluster, Comet is designed to deliver a significantly increased level of computing capacity and customizability to support data-enabled science and engineering at the campus, regional and national levels.”

The University of Texas at Austin to Deploy Wrangler, An Innovative New Data System

The Texas Advanced Computing Center (TACC) at The University of Texas at Austin recently announced plans to build Wrangler, a groundbreaking data analysis and management system for the national open science community, funded by a $6 million National Science Foundation (NSF) grant. Featuring 20 petabytes of storage on the Dell C8000 platform and using PowerEdge R620 and R720 compute nodes, Wrangler is designed for high-performance access to community data sets. When completed in January 2015, it will support the popular MapReduce software framework and a full ecosystem of analytics for Big Data. Wrangler will integrate with TACC’s Stampede supercomputer and, through TACC, will be extended to NSF Extreme Science and Engineering Discovery Environment (XSEDE) resources around the country.
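
Since the announcement highlights MapReduce, here is a minimal illustration of that programming model in plain Python: a toy word count, not Wrangler's actual analytics stack:

```python
# Toy word count in the MapReduce style: a map phase emits (key, value)
# pairs, and a reduce phase aggregates all values sharing a key.
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) pairs for every word in every input record."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Group pairs by key and sum the counts per key."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

docs = ["big data analytics", "data intensive science", "open science"]
print(reduce_phase(map_phase(docs)))
# {'big': 1, 'data': 2, 'analytics': 1, 'intensive': 1, 'science': 2, 'open': 1}
```

In a real MapReduce framework the same two phases run in parallel across many nodes, with the framework handling data partitioning and shuffling between them.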

“Wrangler is designed from the ground up for emerging and existing applications in data intensive science,” said Dan Stanzione, Wrangler’s lead principal investigator and TACC deputy director. “Wrangler will be one of the largest secure, replicated storage options for the national open science community.”

Dell at SC13

Hear from experts from the University of Florida, Clemson University, University of North Texas, University of Wisconsin at Madison and Stanford University about how they are harnessing the power of data for discovery at the “Solving the HPC Data Deluge” session on Nov. 20, 1:30-2:30 p.m. at Dell Booth #1301. And learn about HPC virtualization from the University of California at San Francisco, Florida State University, Cambridge University, Oklahoma University and Australian National University from 3-4 p.m. For more information on Dell’s presence at SC13 visit this blog, and follow the conversation at HPCatDell.

Dell World

Join us at Dell World 2013, Dell’s premier customer event exploring how technology solutions and services are driving business innovation. Learn more at www.dellworld.com, attend our virtual Dell World: Live Online event or follow #DellWorld on Twitter.
