Next Generation Data Center Cooling Strategies: COPT, Vertiv, Pixel Factory, Subzero, HP
by Brian Klebash
ASHBURN, Virginia – CAPRE’s Fourth Annual Washington, D.C. & Mid-Atlantic Data Center Summit kicked off with a Special Breakfast Workshop titled Next Generation Data Center Cooling Strategies.
John Peter Valiulis, Vice President at Emerson Network Power, moderated this bright-and-early four-person panel. He opened by asking Jeff Ivey, General Manager of Data Center Operations at Corporate Office Properties Trust (COPT), about his firm’s latest cooling strategies and goals.
“One of the biggest challenges we have is costs – operating costs going forward,” said Ivey. “As we look at new strategies, we’re looking at ways to bring our efficiencies down, understanding that in the colocation markets, our customers aren’t utilizing the space that they’re buying.” Ivey said that customers usually require 300-400 kilowatts worth of power. He continued, “You still want to be able to provide them with efficient cooling strategies and not spend a fortune on it. And figure out a product that’s going to get you that efficiency without going crazy on your operating costs.”
Next, Valiulis turned to Scott Brown, President of Pixel Factory, Inc., for a different perspective on this topic. Brown, who operates a smaller data center, replied, “We struggle with CapEx and OpEx for equipment. We’ve really had to think out of the box to make sure that we can keep our PUE down and service our customers and still turn a profit.” Brown noted that Pixel Factory doesn’t use thermal modeling software.
The topic then shifted to the government’s cooling strategies and goals, and Valiulis asked John Peterson, Technical Program Manager for HP Critical Facilities Services delivered by EYP MCF, to expound on his experiences working with the U.S. government.
Peterson obliged: “A couple of years ago we saw consolidation efforts. Now we’re seeing Executive Order 13693 starting to take effect. A lot of people are saying, ‘We’ve got to come up with all these strategies to get to that 1.5 PUE for all existing data centers.’ No more building new data centers for the government for a while, right? You’ve got to have some sort of super exception to get something built…so a lot of people are saying, ‘How high can we take air cooling? What about water cooling? What are the solutions there?’” Peterson likened this to working with colocation centers, since a data center manager might work with three or four groups that each have to pay for their own cooling strategies.
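For readers unfamiliar with the metric behind that 1.5 target: PUE (Power Usage Effectiveness) is simply total facility energy divided by IT equipment energy, so 1.0 is the theoretical ideal. A minimal sketch with hypothetical numbers (the figures below are illustrative, not from the panel):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (all power goes to IT); Executive Order
    13693 pushed existing federal data centers toward a 1.5 target.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,200 MWh total draw, 800 MWh of IT load.
# The remaining 400 MWh (cooling, power conversion, lighting) is overhead.
print(round(pue(1200, 800), 2))  # 1.5
```

The extra 0.5 above 1.0 is dominated by cooling, which is why every strategy on this panel — containment, economization, water-side versus air-side — targets that overhead term.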
Larry Mainers, Founder & CEO of SubZero Engineering, then chimed in about what he is seeing from the customers he supplies solutions to on this front. “We believe that the separation of supply and return airflow is just an air conditioning fundamental,” Mainers began. “That’s a foundation. Once you separate those two airflows, then you can start adding all kinds of interesting systems, such as outside-air return. Containment is the solid foundation of any kind of thermal cooling system.”
The perspective returned to Jeff Ivey as the panel transitioned to the topic of emerging technologies. “Things like the wheel as a cooling method that you can go to for a little more free cooling. We go air side, we go water side, and pushing those limits further and further is what we’re aiming for,” Ivey explained. “We’ve looked at all the data – does it matter if it’s in Richmond or San Antonio or D.C.? We’ve applied all the different climate zones, and basically the economizer hours on the direct systems just weren’t there. We weren’t getting the percentages out of those direct systems that we were getting even with a water-cooled chiller. Performance and economization were much better and the PUEs were lower.”
Next there was some brief discussion about changing expectations regarding cooling efficiency, to which Brown shared a story about how his firm led the recovery of the Bank of New York after September 11. Due to window blasts, 500,000 square feet of data center were exposed to 6-8 inches of dust and the warm outside air. Even so, 80% of the data was recovered. Brown said the moral of the story is that the focus on freezing data centers is misplaced; some warmth can in fact be a very welcome cost-saver.
The final topic on deck was the changing role of water in cooling strategies, on which Ivey again chimed in. “We’ve actually thought about giving up some efficiency to get rid of water,” he said, citing many reasons for the shift – cost, sustainability, simplicity, staffing, and more. Peterson echoed these sentiments for the governmental sector.
Valiulis opened the panel up to questions from the audience. One attendee inquired about the impact of a top-notch water reclamation facility in a place like Ashburn, Virginia, on Ivey’s cooling calculus, to which Ivey responded that it depends on the application. “If it’s a single-user enterprise level or if it’s a colocation, it depends on the efficiency of the building,” he replied. “In Ashburn we’re doing some with water; it’s the most efficient way to go. In other areas we’re getting away from water.”
The next question built on this topic, asking, “If data centers could use less water, what kind of percentage effect do you see on PUE in the future?” Ivey responded that it would be an incremental effect – decimals, to some chuckles. “.212, .213. It takes containment, and all of the other components along the way to get there…running a DC this time of the year vs. pushing water through, your efficiencies depend on the time of year. We’re now getting to the point where we’re gonna open it up and go to free cooling airside. We do waterside all year, but now we’re starting to push into the air side. Our filter changes go up, there are costs that go along with that, but the costs are very small.”
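Ivey’s “decimals” answer follows directly from the arithmetic of PUE: since PUE = (IT load + overhead) / IT load, trimming cooling overhead shifts PUE by exactly the savings divided by the IT load. A brief sketch with hypothetical figures (none of these numbers come from the panel):

```python
def pue_delta(it_kw: float, overhead_kw: float, cooling_savings_kw: float) -> float:
    """Change in PUE when cooling overhead drops by cooling_savings_kw.

    PUE = (it + overhead) / it, so the improvement reduces to
    savings / IT load -- which is why trading water for air-side
    economization tends to move PUE by hundredths, not whole points.
    """
    before = (it_kw + overhead_kw) / it_kw
    after = (it_kw + overhead_kw - cooling_savings_kw) / it_kw
    return before - after

# Hypothetical: 1,000 kW IT load, 400 kW overhead; air-side free cooling
# trims 20 kW of mechanical cooling -> PUE improves by only 0.02.
print(round(pue_delta(1000, 400, 20), 3))  # 0.02
```

Against a 1,000 kW IT load, even a 100 kW cut in cooling energy only moves PUE by 0.1 – consistent with Ivey’s point that the gains are incremental and accumulate through containment and many small components.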
With that answer, the Special Breakfast Workshop concluded, and CAPRE’s Fourth Annual Washington, D.C. & Mid-Atlantic Data Center Summit continued on to the Keynote Presentation, Tucker Intelligence: Comparing and Contrasting National and Mid-Atlantic Data Center Markets.