Does Big Data Need an HPC Boost?

What are firms doing with HPC?

This post is sponsored by The Business Value Exchange and HP Enterprise Services

When should High-Performance Computing (HPC) be considered an integral and essential part of the so-called 'business transformation' process? This is not a question often posed.

We normally center our fascination with the business transformation process on any given firm's path toward secure (but productive) enterprise mobile computing, Big Data analytics, cloud computing flexibility and new workflow constructs that embrace social media and online communication.

But what about plain old raw power? Shouldn't High-Performance Computing - or HPC, as we normally call it - and the opportunity it offers to turbo-charge the enterprise also form part of our current transformation plans?

Advanced and Complex Applications
HPC is defined as a computing environment that employs parallel processing to run what we will call "advanced and complex applications" in an effective and efficient manner. To be more precise, the term applies to any computing environment capable of operating above a teraflop - that is, 10^12 floating-point operations per second.
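To put that teraflop threshold in perspective, here is a rough back-of-the-envelope sketch (our own illustration, not any formal HPC benchmark) that times a dense matrix multiply with NumPy, assuming NumPy is installed, and reports the floating-point rate achieved:

# Illustrative only: estimate achieved FLOP/s from a dense matrix multiply.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b  # dense n x n matrix multiply
elapsed = time.perf_counter() - start

# A dense n x n matrix multiply costs roughly 2 * n^3 floating-point operations.
flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOP/s (a teraflop is 1,000 GFLOP/s)")

Typical commodity hardware will land well short of the teraflop mark on a test like this; HPC systems clear it through massive parallelism across many processors.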

Although it is true to say that most HPC systems until now have been tasked with compute jobs in fields including scientific research, molecular engineering and (for example) high-grade military uses... things are changing.

HPC is used to perform tasks including data storage (and, of course, analysis) and what we used to call (and sometimes still do) data mining. It is also used to run complex simulation scenarios, to carry out deep mathematical calculations and to visualize complex data.
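The common thread running through all of these tasks is parallel decomposition: one large job is split into many smaller ones that run side by side and are then recombined. As an illustrative sketch (our own, with Python's multiprocessing pool standing in for a real HPC scheduler), here is that pattern applied to a simple data-analysis job:

# Illustrative only: fan a data-analysis job out to parallel workers.
from multiprocessing import Pool
import statistics

def analyse_chunk(chunk):
    # Each worker summarises its own slice of the data independently.
    return (len(chunk), sum(chunk), statistics.mean(chunk))

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Partition the dataset into four chunks and process them in parallel.
    chunks = [data[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        results = pool.map(analyse_chunk, chunks)
    total = sum(count for count, _, _ in results)
    print(f"processed {total:,} records across {len(results)} workers")

On a genuine HPC cluster the same fan-out/fan-in pattern is simply scaled across thousands of nodes rather than four local processes.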

... and so today, with so many firms becoming ever more heavily digitised, the argument for taking HPC into a wider range of business applications arises.

In terms of wider usage, HPC can be used to develop, test and redesign products while optimizing production and delivery processes. Ultimately, Big Data will need HPC to store and analyze information and produce insight. HPC can also be used (alongside Big Data intelligence) for customer trend monitoring, searching and/or profiling.

What Are Firms Doing with HPC?
HP has announced that Airbus has boosted its HPC capacity for aircraft development to 1,200 teraflops by deploying a new generation of HP Performance Optimized Datacenters (PODs) in Toulouse and Hamburg.

Each of Airbus' 12-meter-long containerized HP PODs delivers the equivalent of nearly 500 square meters of data center space and contains all the elements of an HP Converged Infrastructure: blade servers, storage, networking, software and management (as well as integrated power and cooling).

"Organizations like Airbus need creative scenarios to cater for future business needs," said Peter Ryan, senior vice president & general manager, HP Enterprise Group EMEA. "HP will continue to provide the newest, most powerful technology and operations to support Airbus' HPC for the next five years."

Big Data needs HPC and HPC needs to work on Big Data - this is a marriage made in heaven.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development, as well as all related aspects of software engineering, project management and technology as a whole.
