CapRE’s Mid-Atlantic Data Center Summit Preview: Innovations in Power, Cooling, Monitoring and Analytics with Henry Amistadi

Sep 11, 2018
by Josh Anderson

LEESBURG, VA — Henry Amistadi has many years of experience working both sides of enterprise data centers, as an IT performance analyst and as a data center HVAC/facilities analyst. As a performance analyst and tester for more than 10 years, he tested or analyzed the performance of every kind of hardware and software. He performed many application usage and VM performance analyses. His collection of application workloads is used when estimating the cost of moving apps to the cloud. He was involved in the virtualization of many enterprise apps.

Prior to MITRE, Henry had more than 15 years’ experience designing and developing building HVAC design and energy software for mechanical engineers, utility companies, and HVAC manufacturers. He was an active member of the ASHRAE Energy Calculation Technical Committee. Amistadi will be a featured speaker at CapRE’s Leesburg Data Center Summit tomorrow, where he will share details about an innovative data center modernization project on the panel “Data Center Management & Operations: Ensuring Cloud Service Readiness, Availability, Power & Cooling” from 2:30 pm – 3:15 pm. Below is a Q&A with Henry in which he shares a sneak peek into the project.

CapRE: Good to speak with you, Henry. We’re looking forward to learning about this at our Leesburg Data Center Summit on Wednesday. First, tell us about yourself and your data center modernization project.

Henry Amistadi, Retired, MITRE

Amistadi: First, I should emphasize that I just retired from MITRE, so my opinions are my own and not MITRE’s. For some context, what’s unique about my qualifications is my expertise in three areas: IT performance & operations, data center power & cooling, and data analytics & machine learning. Not many people are equally versed in those three topics. Usually there are divides between Facilities, IT, and senior management. Because of my background, I bridged these gaps at MITRE.

Our mandate was to modernize the data center with state-of-the-art power and cooling technology to provide a showcase for our other Centers that advise the federal government. To this end, I thoroughly evaluated new power and cooling technologies. We decided on Opticool, an active rear-door, pumped-refrigerant heat extraction system, cooled by available and efficient campus chilled water. For power distribution, we supplied 415V power to the cabinets instead of 240V.

To gain approval from the Director and CIO, we had to justify that our staged, modular approach to expansion, which reused existing data center space, was more cost-effective than colocation.

Facilities was resistant to new technologies they weren’t familiar with. It was my job to answer their questions and concerns so they would be on board. Opticool isn’t CRAC-based; it’s a rear-door system. It uses pumped refrigerant, not a compressor or water. We were their first installation in New England, and it was “bleeding edge” technology six years ago. Because it was innovative, and Facilities had concerns, we heavily monitored the cabinet temperatures, power use, and Opticool’s performance.

We learned how to tune the system to improve performance through a series of experiments. We did this by using Tableau to bring together data from the disparate monitoring systems and compare the results before and after each change. This data and visualization from “the field” proved useful to the vendor as well.
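As an illustration of that before/after workflow, here is a minimal pandas sketch. The actual analysis was built in Tableau; the file name, change date, and the cabinet and inlet_temp_f columns below are all hypothetical.

```python
# Minimal sketch of the before/after comparison described above.
# The real analysis was done in Tableau; file, date, and column
# names here are hypothetical.
import pandas as pd

# Cabinet inlet temperatures exported from the monitoring system
temps = pd.read_csv("cabinet_temps.csv", parse_dates=["timestamp"])

# Date a tuning change (e.g., a setpoint adjustment) took effect
change_date = pd.Timestamp("2015-06-01")

before = temps[temps["timestamp"] < change_date]
after = temps[temps["timestamp"] >= change_date]

# Compare mean and spread of inlet temperature per cabinet, before vs. after
summary = pd.concat(
    {
        "before": before.groupby("cabinet")["inlet_temp_f"].agg(["mean", "std"]),
        "after": after.groupby("cabinet")["inlet_temp_f"].agg(["mean", "std"]),
    },
    axis=1,
)
print(summary.round(1))
```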

CapRE: Very interesting. So how did this all come about?

Amistadi: MITRE is an R&D company that operates many Centers for the federal government, such as the Air Force. So even people like me, in Corporate IT, have an opportunity to innovate, since in this case we provided a showcase for innovative data center technology that the Centers can show their customers. We got to spend more time researching the power and cooling technology before making our design decisions, and we were able to install a research-quality monitoring system. Not all corporations have this luxury.

Once the power and cooling systems were operational, we took an innovative approach to monitoring and analytics. I built a Tableau workbook that brought together the cabinet temperature and power data with campus and Opticool BAS data. Being able to visualize data from all three systems, for the same time frame, shows how the systems interact. Named the “data center analyst workbench,” it provided a showcase of what you could learn by analyzing detailed monitoring data. It was instrumental in diagnosing problems, fine-tuning systems, and documenting issues for the vendor.
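A rough sketch of that data-joining idea, assuming each system can export time-stamped CSVs. All file and column names here are hypothetical; the real workbench was a Tableau workbook connected to the systems’ databases.

```python
# Sketch of aligning three monitoring systems on one time base.
# All file and column names are hypothetical.
import pandas as pd

def load(path):
    df = pd.read_csv(path, parse_dates=["timestamp"])
    # Resample to a common 15-minute grid so the systems line up
    return df.set_index("timestamp").resample("15min").mean(numeric_only=True)

cabinet_temps = load("cabinet_temps.csv")  # rack-level temperature sensors
cabinet_power = load("cabinet_power.csv")  # rack-level power meters
bas_points = load("bas_points.csv")        # campus and Opticool BAS data

# One table, one time axis: interactions such as chilled-water supply
# temperature vs. cabinet inlet temperature can now be plotted together.
combined = (
    cabinet_temps.join(cabinet_power, rsuffix="_kw")
                 .join(bas_points, rsuffix="_bas")
)
print(combined.head())
```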

CapRE: Tell us about the data center analyst workbench approach and how it compares to DCIM.

Amistadi: What I am proposing is another direction for data center analytics. Traditionally, people have some form of HVAC and power monitoring from their building automation system (BAS) and, in some cases, additional temperature and power monitoring at the aisle or cabinet level. Typically, there is separate software for temperature, power, and HVAC. In some cases, there may be DCIM software that brings the data together to provide real-time operational awareness. DCIM products are powerful and expensive workhorses for operations and are implemented all at once. They tend to have canned reports with limited customization or ability to add new reports (and if they do allow it, it’s in a proprietary language). DCIM can include many facets, such as asset and configuration management. For this discussion, I am limiting the scope to power and cooling performance and IT environmental conditions.

From this perspective, it’s possible to build the same functionality using Tableau, a general-purpose visualization tool that brings together the data from the different systems’ databases. Then you can build your own application incrementally, focusing not only on current conditions but also on trends and analytics to diagnose or predict problems. Broadening the perspective, this approach can also be used to complement DCIM capabilities.

The data center analyst workbench approach is best suited to electrical or HVAC engineers, or data center analysts responsible for troubleshooting, tuning, and diagnosing or predicting problems. It looks at trends and relationships between variables (from different systems) to help people solve problems quickly.

The incremental approach means that the history of addressing problems results in a collection of visualizations that remain useful into the future. Workbooks grow organically: we started with three sheets, and after five years there are twenty.

The workbench is used to address management questions, problem-prone areas, and changes in operations, which increases reliability and energy savings while reducing downtime and time to repair. That’s the impact of analytics.

CapRE: Is it a product or an approach?

Amistadi: A Data Center Analyst Workbench (DCAW) is an approach. Each implementation is custom built. Its goal is to enable staff to do self-service analytics of data center operations & operating costs.

However, if you ask any data center person about the DCAW approach, they will not have heard of it. It’s a new concept. People have separate, disparate systems or an enterprise DCIM system. No one knows that you can pull data from different systems into Tableau, start from scratch, and develop combined views of cooling and IT temperatures & power. It’s using new technology to visualize and analyze data center monitoring data. Tableau’s traditional market has been to supplement Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) systems to do custom sales and marketing visualization and analysis.

CapRE: Tell us more about the self-service analytics movement in general.

Amistadi: There is a growing interest in data analytics. It started with e-commerce and moved into corporate operations. Currently, it’s focused on the income side, primarily sales and marketing. In that space, there are mega-product suites such as Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) software on which companies spend millions of dollars, e.g., Oracle Financials, Cognos, and Salesforce.

It turns out that even though corporations have these mega-product suites and BI departments that generate reports for people, there is a massive movement of people in the financial, sales, and marketing departments toward using Tableau for self-service analytics. This is because they’re not getting what they want from the BI or data services departments fast enough, and they want to build their own analytics. There are thousands of people and many companies enabling their workforce with self-service analytics tools like Tableau.

The data center analyst workbench approach applies self-service analytics to the expense side of the house, in particular, data centers. This isn’t being done in the industry yet.

CapRE: What’s the next phase of the data center analyst workbench approach?

Amistadi: The workbench provides visualizations, or descriptive analytics. Beyond visualization, which is manual (observing that “this” went down when “that” went up), is machine learning, which automatically detects changes in behavior. This way, if you have lots of cabinets and cooling equipment, you don’t have to drill down through many graphs looking for problems. What machine learning does is identify and prioritize the important graphs for you.
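As a simple illustration of that prioritization idea, a rolling z-score detector can rank which sensors, and therefore which graphs, deserve attention first. This is one of many possible techniques, not necessarily the method from Amistadi’s paper, and the data layout and names are hypothetical.

```python
# Illustrative sketch: rank sensors by recent anomaly so an analyst
# knows which graphs to open first. A rolling z-score stands in for
# the machine-learning techniques mentioned; names are hypothetical.
import pandas as pd

readings = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])
wide = readings.pivot(index="timestamp", columns="sensor", values="value").sort_index()

# Score each sensor against its own 7-day rolling baseline
baseline = wide.rolling("7D").mean()
spread = wide.rolling("7D").std()
zscores = ((wide - baseline) / spread).abs()

# Worst anomaly per sensor over the most recent day (96 x 15-minute samples)
priority = zscores.tail(96).max().sort_values(ascending=False)
print(priority.head(10))
```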

That’s the next phase. I wrote a paper on anomaly detection in IT systems, and plan to apply it to data centers. There are also other machine learning techniques that can be used for data center diagnostics, predictions and decision support.

CapRE: Great! Thank you for your time, Henry. We look forward to hearing more about this from you in Leesburg.
