Future Facilities Guides Data Center Industry on Calibration for Predictive Modeling

Future Facilities, a leading provider of data center design and operations management software, today announced that Mark Seymour, data center cooling expert and chief technical officer at Future Facilities, has published the first in a series of white papers explaining the importance of model refinement and calibration when predictively modeling the availability, physical capacity and cooling efficiency of a data center. Aimed at owner-operators, “What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling” brings clarity to an increasingly important area of data center operations.

For many data center owner-operators, using computational fluid dynamics (CFD) simulations to predictively model the impact that future changes will have on availability, physical capacity and cooling efficiency (ACE), or to help resolve ACE problems in a data center, is second nature.

And, despite the historical connotations that CFD brings to mind – a complex and intimidating tool requiring expert knowledge – the reality is that predictive modeling has never been simpler or easier for the lay person to use.

But the success of predictive modeling still lies ultimately in the hands of the user. Summed up colloquially as “garbage in, garbage out”, the most pressing dangers for predictive modelers are low-fidelity models and a lack of calibration: low-fidelity models (garbage in) produce inaccurate results (garbage out), and without calibration those results bear no resemblance to reality.

For some, the solution to the “garbage in, garbage out” challenge is not to improve the model and calibrate it, but to lazily fix the results of the model to match what is being seen in real life. “That renders the model useless”, says Seymour. Instead, “owner-operators and consultants must exercise due diligence: review and measure the actual installation, then improve the accuracy of the model until it produces dependable results”.

So, how do you make the model dependable? How do you calibrate it? Seymour’s paper provides introductory answers to exactly those questions, highlighting that calibration is a fairly simple process, but one that benefits from a systematic approach. He promises follow-on papers later in the year that will cover specific problem areas, but for the moment he reveals in this paper what 20 years’ experience has taught him are the most common mistakes that people make.
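The systematic approach Seymour advocates can be pictured as a simple loop: measure the real installation, compare those measurements against the model’s predictions, and refine the parts of the model that disagree until the results are dependable. As a purely illustrative sketch (the data, names and tolerance here are hypothetical, and this is not the 6SigmaDC API), the comparison step might look like this:

```python
# Illustrative calibration check (hypothetical data, not vendor software):
# compare measured cabinet inlet temperatures against model predictions and
# flag locations whose discrepancy exceeds a tolerance, marking them for
# model refinement before the next comparison pass.

TOLERANCE_C = 2.0  # acceptable measured-vs-modeled discrepancy, deg C

measured = {"cab_A1": 22.5, "cab_A2": 24.1, "cab_B1": 27.8, "cab_B2": 23.0}
modeled  = {"cab_A1": 22.1, "cab_A2": 23.7, "cab_B1": 31.2, "cab_B2": 23.4}

def calibration_report(measured, modeled, tol):
    """Return locations where the model disagrees with measurement by > tol."""
    return {
        loc: round(modeled[loc] - measured[loc], 1)
        for loc in measured
        if abs(modeled[loc] - measured[loc]) > tol
    }

suspect = calibration_report(measured, modeled, TOLERANCE_C)
print(suspect)  # cabinet B1 disagrees by several degrees: revisit its model
```

In practice each flagged location prompts a physical review – is an obstruction, leakage path or cabinet detail missing from the model? – before the simulation is rerun and the comparison repeated.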

Using real life examples illustrated using Future Facilities’ 6SigmaDC suite of tools, he shows how to overcome systematic errors affecting floor tiles, grilles, cabinets, cable bundles and other common data center objects. Seymour also provides advice on the “tough modeling decisions”, including whether or not to model poorly defined obstructions “such as water pipes under a cooling unit”. Specific advice is provided for calibration of the air supply system and its component parts, with Seymour cautioning upfront, “Do not overlook the fact that it is not just the bulk airflow that matters, but also the flow distribution”.
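Seymour’s caution about flow distribution can be made concrete with a small, hypothetical example: two supply layouts can deliver an identical bulk airflow while distributing it very differently across floor tiles, so a bulk-only check would pass a layout that starves some cabinets. A simple spread statistic exposes the difference (the tile readings below are invented for illustration):

```python
# Hypothetical per-tile airflow readings (CFM). Both layouts supply the same
# bulk airflow, but layout B concentrates flow on a few tiles -- exactly the
# kind of distribution problem a bulk-airflow check alone would miss.

layout_a = [450, 470, 460, 440, 455, 465]   # evenly distributed supply
layout_b = [800, 150, 820, 140, 790, 40]    # same total, badly skewed

def bulk(flows):
    """Total (bulk) airflow across all tiles."""
    return sum(flows)

def spread(flows):
    """Standard deviation of per-tile flow: 0 means perfectly even supply."""
    mean = sum(flows) / len(flows)
    return (sum((f - mean) ** 2 for f in flows) / len(flows)) ** 0.5

print(bulk(layout_a), bulk(layout_b))       # identical bulk airflow
print(spread(layout_a), spread(layout_b))   # very different distributions
```

Calibrating the air supply system therefore means checking both numbers against measurement: the bulk flow delivered by the cooling units and the tile-by-tile distribution that actually reaches the equipment.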

By the end of the text, the reader will not only have a sound appreciation for good, systematic calibration practice, but also understand that, “while the overall facility is complex, many of the individual elements can be individually assessed”. Seymour concludes by saying, “this will make it possible to diagnose why the initial model does not adequately represent the facility… normally, it won’t!”.

Download the Paper:

“What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling” is available here:

http://www.futurefacilities.com/media/info.php?id=266

About Mark Seymour:

Mark Seymour is chief technical officer and a founding member of Future Facilities, which this year celebrates its tenth anniversary. With an academic background in applied science and numerical mathematics, Mark enjoyed a successful career in the defense industry for over a decade before moving to the commercial sector, where he has since accumulated 20 years’ experience in the cooling of data center and communications environments. A recognized expert in the predictive modeling of airflow for building HVAC, and for data centers in particular, Mark is an industrial advisory board member of the NSF-ES2 research program and a corresponding member actively participating in ASHRAE TC9.9.

About Future Facilities

For nearly a decade, Future Facilities has provided software and consultancy to the world’s largest data center owner-operators and to leading electronics designers. The company, which is privately funded, optimizes data center utilization through continuous modeling. In doing so, it has saved its customers millions of dollars. Innovative and progressive, Future Facilities is today unique in the marketplace; it is the only company providing scientifically sound answers to the “what ifs” that have for so long been impossible to answer with real confidence.

Additional information can be found at http://www.futurefacilities.com.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
