By Business Wire
November 18, 2013 10:00 AM EST
At SC13, Dell reaffirmed its long-standing commitment to improving access to and use of high-performance computing (HPC) in research computing. Over the last five years, Dell and its research computing partners have combined integrated server, storage and networking solutions designed specifically for hyperscale and research computing environments with scalable, cost-effective usage models such as HPC-as-a-Service and HPC in the cloud. Together, these offerings simplify collaborative science, improve access to compute capacity and accelerate discovery for the research computing community.
Earlier this year, Dell took its commitment a step further, introducing Active Infrastructure for HPC Life Sciences, a converged solution designed specifically for genomics analysis—a highly specialized and rapidly growing area of research computing. The new solution integrates computing, storage and networking to reduce lengthy implementation timelines while processing up to 37 genomes per day, or 259 genomes per week.
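The quoted daily and weekly throughput figures are consistent with each other, as a quick check shows. The per-genome time below is derived for illustration and assumes the daily rate is sustained around the clock; it is not a figure from the announcement.

```python
# Sanity-check the quoted genomics throughput figures.
# Assumption (not from the announcement): the weekly figure is the
# daily rate sustained over seven days, running 24 hours a day.
genomes_per_day = 37
genomes_per_week = genomes_per_day * 7
print(genomes_per_week)  # 259, matching the quoted weekly figure

hours_per_genome = 24 / genomes_per_day
print(round(hours_per_genome, 2))  # ~0.65 hours (~39 minutes) per genome
```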
Oak Ridge National Laboratory, University of California at San Diego, The University of Texas at Austin, University of Florida, Clemson University, University of Wisconsin at Madison and Stanford University are a few of the hundreds of organizations utilizing Dell’s HPC solutions today to harness the power of data for discovery.
Oak Ridge National Laboratory Supercomputer Achieves I/O Rate of More Than One Terabyte Per Second
To boost the productivity of its Titan supercomputer—the fastest computer in America dedicated solely to scientific research—and better support its 1,200 users and more than 150 research projects, the Oak Ridge National Laboratory (ORNL) Leadership Computing Facility needed a file system with high-speed interconnects to match the supercomputer’s peak theoretical performance of 27 petaflops (27,000 trillion calculations per second). Working with Dell and other technology partners, ORNL upgraded its Lustre-based file system “Spider” to Spider II, quadrupling the file system’s size and speed. It also upgraded the interconnects between Titan and Spider to a new InfiniBand fourteen data rate (FDR) network that is seven times faster and supports an I/O rate in excess of one terabyte per second.
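A back-of-the-envelope calculation gives a feel for what sustaining more than one terabyte per second over FDR InfiniBand implies. The per-lane signaling rate, lane count and encoding below are standard FDR parameters, but the perfect-scaling assumption and the resulting link count are illustrative estimates, not figures from the announcement.

```python
# Rough estimate: how many 4x FDR InfiniBand links does it take to
# aggregate more than one terabyte per second?
# Assumptions (not from the article): 14.0625 Gb/s signaling per lane,
# 4 lanes per link, 64b/66b encoding, and perfect scaling across links.
import math

lane_gbps = 14.0625
lanes = 4
encoding = 64 / 66
link_gbytes = lane_gbps * lanes * encoding / 8   # usable GB/s per 4x link
links_needed = math.ceil(1000 / link_gbytes)     # links to exceed 1 TB/s
print(round(link_gbytes, 2), links_needed)       # ~6.82 GB/s per link, ~147 links
```

In practice a parallel file system like Lustre spreads that traffic across many object storage servers, each with its own FDR port, which is how an aggregate rate far beyond any single link becomes reachable.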
The University of California, San Diego Deploying XSEDE’s First Virtualized HPC Cluster with Comet
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego is deploying Comet, a new virtualized petascale supercomputer designed to meet pent-up demand for computing in areas such as the social sciences and genomics, where a broader set of researchers has a growing need for compute capacity. Funded by a $12 million NSF grant and scheduled to start operations in early 2015, Comet will be a Dell-based cluster featuring next-generation Intel Xeon processors. With a peak performance of nearly two petaflops, Comet will be the first XSEDE production system to support high-performance virtualization, and it is uniquely designed to support many modest-scale jobs: each node will be equipped with two processors, 128 gigabytes (GB) of traditional DRAM and 320 GB of flash memory. Comet will also include some large-scale nodes as well as nodes with NVIDIA GPUs to support visualization, molecular dynamics simulations and genome assembly.
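To connect the quoted per-node specification to the roughly two-petaflop system peak, one can sketch the scale involved. The per-node core count, clock speed and flops-per-cycle figures below are hypothetical values typical of dual-socket Xeon nodes of that era, not specifications from the announcement, so the resulting node count is only an order-of-magnitude estimate.

```python
# Rough estimate (assumed figures, not from the article): if each
# dual-socket node peaks at roughly one teraflop, e.g.
# 2 sockets x 12 cores x 2.5 GHz x 16 flops/cycle,
# how many nodes does a ~2 petaflop system need?
sockets, cores, ghz, flops_per_cycle = 2, 12, 2.5, 16
node_tflops = sockets * cores * ghz * flops_per_cycle / 1000  # ~0.96 TF/node
nodes_needed = round(2000 / node_tflops)                      # TF in 2 PF
print(round(node_tflops, 2), nodes_needed)                    # ~0.96 TF, ~2083 nodes
```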
“Comet is all about HPC for the 99 percent,” said SDSC Director Michael Norman, Comet principal investigator. “As the world’s first virtualized HPC cluster, Comet is designed to deliver a significantly increased level of computing capacity and customizability to support data-enabled science and engineering at the campus, regional and national levels.”
The University of Texas at Austin to Deploy Wrangler, An Innovative New Data System
The Texas Advanced Computing Center (TACC) at The University of Texas at Austin recently announced plans to build Wrangler, a groundbreaking data analysis and management system for the national open science community that will be funded by a $6 million National Science Foundation (NSF) grant. Featuring 20 petabytes of storage on the Dell C8000 platform and using PowerEdge R620 and R720 compute nodes, Wrangler is designed for high-performance access to community data sets. It will support the popular MapReduce software framework and a full ecosystem of analytics for Big Data when completed in January 2015. Wrangler will integrate with TACC’s Stampede supercomputer and through TACC will be extended to NSF Extreme Science and Engineering Discovery Environment (XSEDE) resources around the country.
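The MapReduce framework mentioned above structures an analysis as a map step that emits key-value pairs, a shuffle that groups values by key, and a reduce step that aggregates each group. A minimal in-process sketch of that pattern, using the classic word-count example (illustrative only; production deployments on a system like Wrangler would use Hadoop or a similar distributed framework):

```python
# Minimal single-process sketch of the map -> shuffle -> reduce pattern.
from collections import defaultdict

def map_phase(docs):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in docs:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values (here, sum the counts).
    return {key: sum(values) for key, values in groups.items()}

docs = ["open science data", "open data"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'open': 2, 'science': 1, 'data': 2}
```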
“Wrangler is designed from the ground up for emerging and existing applications in data intensive science,” said Dan Stanzione, Wrangler’s lead principal investigator and TACC deputy director. “Wrangler will be one of the largest secure, replicated storage options for the national open science community.”
Dell at SC13
Hear from experts from the University of Florida, Clemson University, University of North Texas, University of Wisconsin at Madison and Stanford University about how they are harnessing the power of data for discovery at the “Solving the HPC Data Deluge” session on Nov. 20, 1:30-2:30 p.m. at Dell Booth #1301. And learn about HPC virtualization from the University of California at San Francisco, Florida State University, Cambridge University, Oklahoma University and Australian National University from 3-4 p.m. For more information on Dell’s presence at SC13 visit this blog, and follow the conversation at HPCatDell.
Join us at Dell World 2013, Dell’s premier customer event exploring how technology solutions and services are driving business innovation. Learn more at www.dellworld.com, attend our virtual Dell World: Live Online event or follow #DellWorld on Twitter.