Future Facilities Guides Data Center Industry on Calibration for Predictive Modeling

Future Facilities, a leading provider of data center design and operations management software, today announced that Mark Seymour, data center cooling expert and chief technical officer at Future Facilities, has published the first in a series of white papers explaining the importance of model refinement and calibration when predictively modeling the availability, physical capacity and cooling efficiency of a data center. Aimed at owner-operators, “What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling” brings clarity to an increasingly important area of data center operations.

For many data center owner-operators, using computational fluid dynamics (CFD) simulations to predictively model the impact that future changes will have on availability, physical capacity and cooling efficiency (ACE), or to help resolve ACE problems in a data center, is second nature.

And despite the historical connotations that CFD brings to mind – a complex, intimidating technology requiring expert knowledge – the reality is that predictive modeling has never been simpler or more accessible to the lay person.

But the success of predictive modeling still lies ultimately in the hands of the user. The most pressing dangers for predictive modelers, summed up colloquially as “garbage in, garbage out”, are computer models that lack fidelity and are never calibrated. Why? Because low-fidelity models (garbage in) lead to inaccurate results (garbage out) that bear no resemblance to reality (uncalibrated).

For some, the solution to the “garbage in, garbage out” challenge is not to improve the model and calibrate it, but to lazily fix the results of the model to match what is being seen in real life. “That renders the model useless”, says Seymour. Instead, “owner-operators and consultants must exercise due diligence: review and measure the actual installation, then improve the accuracy of the model until it produces dependable results”.

So, how do you make the model dependable? How do you calibrate it? Seymour’s paper provides introductory answers to exactly those questions, highlighting that calibration is a fairly simple process, but one that benefits from a systematic approach. He promises follow-on papers later in the year covering specific problem areas; for the moment, this paper reveals what 20 years’ experience has taught him are the most common mistakes people make.
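To make the idea concrete, here is a minimal sketch of the kind of systematic check the paper describes: compare field measurements against the model’s predictions point by point, and refine the model until every point falls within tolerance. The sensor names, values and tolerance below are invented for illustration; they are not taken from Seymour’s paper.

```python
# Hypothetical calibration check: measured sensor readings vs. the values
# a CFD model predicts at the same locations. All names, values and the
# tolerance are illustrative, not from the paper.

measured = {                  # rack inlet temperatures, deg C
    "rack_A1_inlet": 22.1,
    "rack_A2_inlet": 23.4,
    "rack_B1_inlet": 27.8,    # observed hot spot
}

predicted = {                 # model output at the same points, deg C
    "rack_A1_inlet": 21.7,
    "rack_A2_inlet": 23.0,
    "rack_B1_inlet": 24.2,    # the model misses the hot spot
}

TOLERANCE_C = 1.5             # illustrative per-point acceptance band


def calibration_failures(measured, predicted, tol):
    """Return the points where the model disagrees with measurement."""
    failures = []
    for point, obs in measured.items():
        error = predicted[point] - obs
        status = "ok" if abs(error) <= tol else "refine model"
        print(f"{point}: measured {obs:.1f} C, predicted "
              f"{predicted[point]:.1f} C, error {error:+.1f} C -> {status}")
        if abs(error) > tol:
            failures.append(point)
    return failures


if calibration_failures(measured, predicted, TOLERANCE_C):
    print("Not yet calibrated: improve the model inputs, "
          "do not fudge the outputs.")
```

The loop embodies the paper’s central point: when a point fails, the remedy is to revisit the model’s inputs and refine them, then rerun the comparison, rather than adjusting the outputs to match.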

Using real-life examples illustrated with Future Facilities’ 6SigmaDC suite of tools, he shows how to overcome systematic errors affecting floor tiles, grilles, cabinets, cable bundles and other common data center objects. Seymour also provides advice on the “tough modeling decisions”, including whether or not to model poorly defined obstructions “such as water pipes under a cooling unit”. Specific advice is provided for calibration of the air supply system and its component parts, with Seymour cautioning upfront, “Do not overlook the fact that it is not just the bulk airflow that matters, but also the flow distribution”.
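Seymour’s caution about flow distribution is easy to illustrate numerically. In the hypothetical sketch below, the modeled per-tile airflows sum to exactly the same bulk total as the measured ones, yet the tile-by-tile distribution is badly wrong; all figures are invented for illustration.

```python
# Two per-tile airflow profiles (m^3/s) with the same bulk total but
# very different distributions. Values are invented for illustration.

measured_tiles = [0.18, 0.22, 0.20, 0.21, 0.19]   # field measurements
modeled_tiles  = [0.30, 0.10, 0.25, 0.05, 0.30]   # uncalibrated model

bulk_measured = sum(measured_tiles)                # 1.00 m^3/s
bulk_modeled  = sum(modeled_tiles)                 # 1.00 m^3/s

# The root-mean-square deviation across tiles exposes the mismatch
# that the bulk figure hides.
rms = (sum((m - p) ** 2 for m, p in zip(measured_tiles, modeled_tiles))
       / len(measured_tiles)) ** 0.5

print(f"bulk flow: measured {bulk_measured:.2f}, "
      f"modeled {bulk_modeled:.2f} m^3/s")         # totals agree
print(f"per-tile RMS deviation: {rms:.3f} m^3/s")  # distribution does not
```

A model validated only against the bulk figure would pass this check while placing cold air at entirely the wrong cabinets.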

By the end of the text, the reader will not only have a sound appreciation for good, systematic calibration practice, but also understand that, “while the overall facility is complex, many of the individual elements can be individually assessed”. Seymour concludes by saying, “this will make it possible to diagnose why the initial model does not adequately represent the facility… normally, it won’t!”.
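As a final illustration of assessing elements individually, the hypothetical sketch below checks each modeled component against its own measurement and ranks the components by relative error, pointing the modeler at the element most in need of refinement. The component names and numbers are invented, not drawn from the paper.

```python
# Hypothetical per-element assessment: each component is compared with
# its own measurement, then ranked so the worst offender is refined
# first. All names and values are invented for illustration.

elements = [
    # (name, measured, modeled, units)
    ("CRAC-1 supply flow",    4.70, 4.55, "m^3/s"),
    ("floor tile row A flow", 0.21, 0.27, "m^3/s"),
    ("cabinet 14 inlet temp", 24.0, 27.6, "deg C"),
]


def relative_error(measured, modeled):
    """Fractional deviation of the model from the measurement."""
    return abs(modeled - measured) / abs(measured)


for name, obs, mod, units in sorted(
        elements, key=lambda e: relative_error(e[1], e[2]), reverse=True):
    err = relative_error(obs, mod)
    print(f"{name}: measured {obs} {units}, modeled {mod} {units}, "
          f"relative error {err:.0%}")
```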

Download the Paper:

“What is a Valid Data Center Model? An Introduction to Calibration for Predictive Modeling” is available here:

http://www.futurefacilities.com/media/info.php?id=266

About Mark Seymour:

Mark Seymour is chief technical officer and a founding member of Future Facilities, which this year celebrates its tenth anniversary. With an academic background in applied science and numerical mathematics, Mark enjoyed a successful career in the defense industry for over a decade before moving to the commercial sector, where he has since accumulated 20 years’ experience in the cooling of data center and communications environments. A recognized expert in the predictive modeling of airflow for building HVAC, and for data centers in particular, Mark is an industrial advisory board member of the NSF-ES2 research program and a corresponding member actively participating in ASHRAE TC9.9.

About Future Facilities

For nearly a decade, Future Facilities has provided software and consultancy to the world’s largest data center owner-operators and to leading electronics designers. The company, which is privately funded, optimizes data center utilization through continuous modeling, and in doing so has saved its customers millions of dollars. Innovative and progressive, Future Facilities is today unique in the marketplace: it is the only company providing scientifically sound answers to the “what ifs” that have for so long been impossible to answer with real confidence.

Additional information can be found at http://www.futurefacilities.com.
