Texas A&M System Teams with IBM to Drive Computational Sciences Research through Big Data and Analytics

High performance computing (HPC) system will speed research to advance energy resource management, accelerate materials development, ensure the sustainability of food supplies, and improve animal health

COLLEGE STATION, Texas and ARMONK, N.Y., Jan. 29, 2014 /PRNewswire/ -- The Texas A&M University System and IBM (NYSE: IBM) today announced an agreement that begins a broad research collaboration supported by one of the largest computational sciences infrastructures dedicated to advances in agriculture, geosciences and engineering.

The collaboration will leverage the power of big data analytics and high performance computing (HPC) systems for innovative solutions across a spectrum of challenges, such as improving extraction of Earth-based energy resources, facilitating the smart energy grid, accelerating materials development, improving disease identification and tracking in animals, and fostering better understanding and monitoring of our global food supplies.

"Combining the incredible intellectual and technological resources of Texas A&M University and IBM will further position Texas as a leader in identifying and solving some of the most complex challenges we face," Texas Gov. Rick Perry said. "The work that will be done here will change lives and potentially save lives not just in our state, but our nation and around the world."

IBM will provide the infrastructure for the joint research, consisting of Blue Gene/Q technology, Power and System x servers, and General Parallel File System (GPFS) storage systems. In an on-campus test, the Blue Gene/Q solved a materials science problem that had previously taken weeks, producing a solution with much greater analytical depth in "a fraction of an hour."

"The Texas A&M System and IBM share a passion and a commitment to research that identifies practical solutions to global challenges," said Chancellor John Sharp, Texas A&M University System. "As the largest research university in the state, this agreement is a major step forward for the A&M System in research computing power. This brings together the best computer scientists and technology in the world to focus on issues so important to our role as a leading research institution and to our land-grant mission of serving the state while also providing resources to serve the greater good throughout the world."

IBM Research and the A&M System intend to align skills, assets and resources to pursue fundamental research, applied development, educational reach and sustainable commercial activities with projects that may include:

  • Sustainable Availability of Food: Efficiently providing sufficient food for a growing global population
  • Disease Spread Tracking, Modeling and Prediction: Early and accurate detection and prediction of infectious disease spread to allow the design, testing and manufacturing of medical countermeasures
  • Energy Resource Management: Responsibly exploring, extracting and delivering energy resources
  • New Materials Development: Atomic-level modeling, design and testing of new materials for advanced applications in energy, aerospace, structural engineering and defense

The Texas A&M Engineering Experiment Station (TEES), a premier engineering research agency of Texas that conducts research to provide practical answers to critical state and national needs, will be heavily involved on behalf of the Texas A&M University System. According to Katherine Banks, Director of TEES and Vice Chancellor of Engineering, "This is a unique opportunity to meet the needs of engineering, geosciences, and agriculture and life sciences researchers to expand into areas not feasible before with small-scale HPC systems."

"IBM and the Texas A&M System have crafted a unique collaboration that could apply computational science and big data analytics to some of the most daunting problems in agriculture, geosciences and engineering," said William LaFontaine, Vice President of High Performance Analytics and Cognitive Markets at IBM. "With the combined research capabilities of both institutions and ready access to state-of-the-art computing technology, we feel this collaboration could produce significant scientific insights leading to industry-changing solutions and material economic impact. We are extremely pleased to be engaged with such extraordinarily capable institutions in the A&M System and look forward to years of discovery and innovation."

TEES partners with academic institutions, governmental agencies, industries and communities to solve problems that help improve the quality of life, promote economic development and enhance the educational systems of Texas. It is closely connected with the College of Engineering at Texas A&M University, which is undergoing unprecedented growth toward a goal of 25,000 students by the year 2025 and is hiring a new generation of faculty who will address the nation's needs for research and technology development.

In support of the long-term research effort, IBM will supply the A&M System with cutting-edge, cloud-enabled technical computing technologies. The A&M System will deploy a research computing cloud composed of IBM hardware and software, including:

  • Blue Gene/Q: Serving as the foundation of the computing infrastructure, a Blue Gene/Q system consisting of two racks, with more than 2,000 compute nodes, will provide 418 teraflops (TF) of sustained performance for big data analytics, complex modeling, and simulation of molecular dynamics, protein folding and organ modeling.
  • Power Systems: A total of 75 PowerLinux 7R2 servers with POWER7+ microprocessors will be connected by 10GbE into a system optimized for big data and analytics and high performance computing. This complex includes IBM BigInsights and Platform Symphony software, IBM Platform LSF scheduler, and IBM General Parallel File System.
  • System x: The solution will contain an estimated 900 IBM System x dense hyperscale compute nodes as part of an IBM NeXtScale system. Some of the nodes will be managed by Platform Cluster Manager Advanced Edition (PCM-AE) as a University-wide HPC cloud while the others will be managed by Platform Cluster Manager Standard Edition (PCM-SE) and serve as a general purpose compute infrastructure for the geosciences and open source analytics initiatives.
  • Platform Computing: Platform Computing software will be used to manage and accelerate the various computational workloads. Platform Symphony will drive big data and analytics, and Platform LSF will drive traditional HPC and technical computing workloads (a brief job-submission sketch follows this list). Platform Computing will also power the creation of an HPC cloud, allowing users across the A&M System to access the system.
  • General Parallel File System (GPFS): Five IBM System x GPFS Storage Servers (GSS) will provide five petabytes (PB) of shared storage, accessible to the compute building blocks over high-speed networks. The GPFS deployment will also include an IBM FlashSystem 820 tier with 10 terabytes (TB) of flash storage, delivering the performance needed to accelerate computation primarily for Texas A&M AgriLife Research, Geosciences and university HPC as part of the research computing infrastructure.
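
For readers unfamiliar with how researchers would actually use such a cluster, the sketch below shows one way a batch workload might be submitted through Platform LSF, the scheduler named above. It is a minimal illustration written in Python around LSF's standard bsub command; the helper function, queue name, core count, job command and file paths are assumed placeholders, not details drawn from this announcement.

    # Illustrative sketch: submitting a batch job via IBM Platform LSF's
    # standard bsub command. Queue name, core count, job command and paths
    # are hypothetical placeholders.
    import subprocess

    def submit_lsf_job(command, cores=32, queue="normal", job_name="materials_sim"):
        """Build a bsub request, submit it, and return LSF's textual reply."""
        bsub_cmd = [
            "bsub",
            "-J", job_name,              # job name shown by bjobs
            "-q", queue,                 # target queue (hypothetical name)
            "-n", str(cores),            # number of cores requested
            "-o", f"{job_name}.%J.out",  # stdout log; %J expands to the job ID
            command,                     # the command LSF will run on the cluster
        ]
        result = subprocess.run(bsub_cmd, capture_output=True, text=True, check=True)
        return result.stdout.strip()

    if __name__ == "__main__":
        # e.g. a molecular-dynamics run staged on the shared GPFS file system
        print(submit_lsf_job("./run_md --input /gpfs/projects/example/input.dat"))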

Furthermore, IBM will work with researchers at the A&M System to assess new computing technologies that will be necessary to advance data-driven science discovery and innovation over the next several years.

About IBM
For more information on IBM Research visit www.research.ibm.com.
For more information on IBM Technical Computing visit www.ibm.com/systems/technicalcomputing/.

About the A&M System
The A&M System is one of the largest systems of higher education in the nation, with a budget of $3.5 billion. Through a statewide network of 11 universities, seven state agencies, two service units, a comprehensive health science center and a system administration office, the A&M System educates more than 125,000 students and makes more than 22 million additional educational contacts through service and outreach programs each year. Externally funded research expenditures exceed $780 million and help drive the state's economy.

Contact:
Ciri Haugh
617-693-2345
[email protected]

SOURCE IBM
