The Top Three Challenges an Industrialized Data Center Faces and How to Overcome Them with Performance-Based Practices
SEATTLE, WA — Data centers are expensive assets, and therefore they require extensive protection. That’s why CAPRE included a critical keynote in the schedule at the Great Northwest Cloud Summit. “The Top Three Challenges an Industrialized Data Center Faces and How to Overcome Them with Performance-Based Practices,” presented by Steven Joseph, Director of Market Development for Xtralis | Honeywell, offered a sobering look at the risks inherent to these dense assets, as well as advice for data center operators.
“Of course the number one objective is uptime. We hear a lot about downtime and the consequences that it might have on an organization, considering the criticality of data.” According to Joseph, there are separate standards for data centers and for telecommunications, but the bottom line can’t be ignored – uptime is king. Joseph then referred to the results of a 2015 survey that found power issues were the number one cause of downtime at that time, followed by fire (though, not surprisingly, fire takes longer to recover from than a power outage).
The keynote then turned to a real-life example – a fire that happened at a data center operated by Delta Air Lines. Though it was a small, contained fire, over 2,000 flights were canceled, and the total cost to the company exceeded $100 million USD. “Fires can and do happen,” stressed Joseph, citing high heat densities, malfunctioning and degrading electrical & mechanical systems, on-site energy storage sources, and un-vetted “housekeeping” as the main reasons.
“This is why you need to be able to detect a fire before the smoke becomes physical – as soon as you have smoke, you have contamination and you risk disruption,” he explained. “We want to detect as early as possible, before smoke is visible.” Joseph then dug deep into that criticality, seeking to shine a light on what is required to make sure that happens, walking the audience through the stages of a potential fire in a high-airflow environment such as a data center.
Joseph explained that Honeywell looks to performance-based strategies to overcome the conditions inherent to a data center – such as spacing between racks – because even when all codes and standards are otherwise followed, things can still go wrong. “What can happen is that the airflow comes up vertically between the servers…so we don’t get a spread of smoke. You don’t have a ceiling in all of these environments, and if you can’t get the smoke to the detector, then you can miss a fire altogether,” he warned.
Joseph then highlighted some of the key variables to consider when tackling smoke detection in a data center — airflow velocity; the ability of smoke to reach a sensor (which includes not only the path of the smoke but also any potential obstacles or blockages); temperature (considering the ambient temperature can be much higher than in an average room); and the placement of the smoke detector itself.
This led Joseph to the highlight of the presentation – Honeywell’s Air Sampling Smoke Detection Systems, which include highly sensitive technology in the form of an air sampler as well as a camera to detect and analyze not only the make-up but also the shape, size, and color of errant air particles. “We get such high sensitivity that we can detect smoke before it becomes visible, which is very important in these environments.”
Joseph then listed some of the benefits of this platform: it’s suitable across a wide range of environments; it provides the earliest possible warning without nuisance alarms; it offers a staged response across the stages of fire development (before conditions get out of hand); it’s unobtrusive and non-disruptive; it offers a lower total cost of ownership; and, of course, it improves visibility for general, broader facility diagnostics.
To conclude, Joseph left the audience with one final thought – when a data center operator is looking for a smoke detection system, it’s all about the specifications. “I can’t tell you how many times I’ve seen a specification that doesn’t cover the performance aspect of a system. It all comes back to the performance aspects of this system and making sure that they’re well-specified,” he asserted. Joseph clearly made a good point. After all, you’re spending a lot of money on these systems – they should work for you.