Future Facilities Guides Data Center Industry on Calibration for Predictive Modeling

Future Facilities, a leading provider of data center design and operations management software, today announced that Mark Seymour, data center cooling expert and chief technical officer at Future Facilities, has published the first in a series of white papers explaining the importance of model refinement and calibration when predictively modeling the availability, physical capacity and cooling efficiency of a data center. Aimed at owner-operators, "What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling" brings clarity to an area of data center operations that is increasingly important.

For many data center owner-operators, using computational fluid dynamics (CFD) simulations to predictively model the impact that future changes will have on availability, physical capacity and cooling efficiency (ACE), or to help resolve ACE problems in a data center, is second nature.

And, despite the historical connotations that CFD brings to mind – a complex and intimidating solution requiring expert knowledge to use – the reality is that predictive modeling has never been simpler or easier for the lay person to take advantage of.

But the success of predictive modeling still lies ultimately in the hands of the user. The most pressing dangers for predictive modelers, summed up colloquially as "garbage in, garbage out", are that their computer models lack fidelity and are uncalibrated. Why? Because low-fidelity models (garbage in) lead to inaccurate results (garbage out) that bear no resemblance to reality (uncalibrated).

For some, the solution to the “garbage in, garbage out” challenge is not to improve the model and calibrate it, but to lazily fix the results of the model to match what is being seen in real life. “That renders the model useless”, says Seymour. Instead, “owner-operators and consultants must exercise due diligence: review and measure the actual installation, then improve the accuracy of the model until it produces dependable results”.

So, how do you make the model dependable? How do you calibrate it? Seymour’s paper provides introductory answers to exactly that question, highlighting that it is a fairly simple process, but one that benefits from a systematic approach. He promises follow-on papers later in the year that will cover specific problem areas, but for the moment he reveals in this paper what 20 years’ experience has taught him are the most common mistakes that people make.

Using real-life examples illustrated with Future Facilities' 6SigmaDC suite of tools, he shows how to overcome systematic errors affecting floor tiles, grilles, cabinets, cable bundles and other common data center objects. Seymour also provides advice on the "tough modeling decisions", including whether or not to model poorly defined obstructions "such as water pipes under a cooling unit". Specific advice is provided for calibration of the air supply system and its component parts, with Seymour cautioning upfront, "Do not overlook the fact that it is not just the bulk airflow that matters, but also the flow distribution".
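Seymour's bulk-airflow-versus-distribution caution can be illustrated with a minimal sketch (not taken from the paper; the tile names, airflow values and 10% tolerance below are hypothetical): a model's total supply airflow can match the site survey closely while an individual tile is still badly wrong.

```python
# Hypothetical calibration check: compare measured vs. modeled airflow
# per floor tile. All values (CFM) and the tolerance are illustrative only.

measured = {"tile_A1": 450.0, "tile_A2": 520.0, "tile_B1": 300.0}  # site survey
modeled = {"tile_A1": 470.0, "tile_A2": 480.0, "tile_B1": 350.0}   # CFD model

# 1. Bulk airflow: totals may agree even when the distribution does not.
bulk_error = abs(sum(modeled.values()) - sum(measured.values())) / sum(measured.values())

# 2. Flow distribution: check each tile individually against a tolerance.
tile_errors = {
    tile: abs(modeled[tile] - measured[tile]) / measured[tile]
    for tile in measured
}

print(f"bulk airflow error: {bulk_error:.1%}")
for tile, err in sorted(tile_errors.items()):
    status = "OK" if err <= 0.10 else "RECHECK"  # 10% tolerance, illustrative
    print(f"{tile}: {err:.1%} {status}")
```

With these example numbers the bulk error is under 3%, yet tile_B1 is off by more than 16%, so a bulk-only check would wrongly pass the model; exactly the oversight Seymour warns against.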

By the end of the text, the reader will not only have a sound appreciation for good, systematic calibration practice, but also understand that, “while the overall facility is complex, many of the individual elements can be individually assessed”. Seymour concludes by saying, “this will make it possible to diagnose why the initial model does not adequately represent the facility… normally, it won’t!”.

Download the Paper:

"What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling" is available here:

http://www.futurefacilities.com/media/info.php?id=266

About Mark Seymour:

Mark Seymour is chief technical officer and a founding member at Future Facilities, which this year celebrates its tenth anniversary. With an academic background in applied science and numerical mathematics, Mark enjoyed a successful career in the defense industry for over a decade before moving to the commercial sector. There he has since accumulated 20 years' experience in the cooling of data center and communication environments. A recognized expert in the predictive modeling of airflow for building HVAC, and for data centers in particular, Mark is an industrial advisory board member of the NSF-ES2 research program and a corresponding member actively participating in ASHRAE TC9.9.

About Future Facilities

For nearly a decade, Future Facilities has provided software and consultancy to the world's largest data center owner-operators and to leading electronics designers. The company, which is privately funded, optimizes data center utilization through continuous modeling. In doing so, it has saved its customers millions of dollars. Innovative and progressive, Future Facilities is today unique in the marketplace; it is the only company providing scientifically sound answers to the "what ifs" that have for so long been impossible to answer with real confidence.

Additional information can be found at http://www.futurefacilities.com.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
