CAPRE Exclusive | Staying Relevant as a Data Center Provider: Keep Ahead of the Market, Stay Flexible, Prioritize Security

DENVER, CO – A centerpiece of CAPRE’s International Data Center Series is how to navigate and capitalize on a changing industry and changing enterprise landscape. That’s why CAPRE’s Construction, Design, Engineering Evolution and the Optimization of Hyperscale, Colocation & Enterprise Data Centers in Denver featured a discussion titled “Staying Relevant as a Data Center Provider: Digital Transformation and Optimizing the Client’s Journey to the Cloud,” with a pair of forward-thinking thought leaders from two highly influential organizations: Ed Kimm of IBM’s Global Technology Services team, and Craig Cook, Vice-President for Solutions Architecture and Engineering at Flexential.

Ed Kimm, Global Technology Services, IBM

“One reason I joined IBM was that we were not trying to compete against the other hyperscalers. In my opinion we wouldn’t ever catch them,” began Kimm. “Where I see the market going is in an open, hybrid, and agnostic cloud ecosystem. One cloud doesn’t fit everyone. There are different needs and different strokes for different folks. IBM is changing the game when it comes to managing multi-cloud hybrid environments.”

This is why, when it comes to figuring out where to put a workload and why, IBM knows what it’s doing. “IBM is the clearest leader when it comes to orchestration and managing workloads, even in disparate agnostic IT environments,” Kimm shared, with an eye to the future. “That is where I see the market is headed. Agnostic, hybrid cloud, open source.”

After some discussion about the latest at IBM, moderator Brian Klebash, Founder and CEO of CAPRE, asked Kimm and his fellow panelist Cook: can you have too many clouds?

Kimm served up the first response. “It depends whom you ask,” he replied. “The multi-cloud [concept] is predicated upon hyperscalers. You have community clouds, private clouds, colo, and more of the up and down versus the left and right. There is an inflection point where organizations are doing capex [investments for] infrastructure, but they’re also doing stuff at their Edge.” In other words, it’s not about having too many Clouds, but about how to manage the Clouds you do have.

At that point, Flexential’s Cook chimed in. “You’ve always got trade-offs in striking that balance. There’s no one answer that works for everyone,” he concurred. “Everybody has got to make this decision themselves. Going back to something Ed said about your team, having a managed services division is key. A product set alone doesn’t ensure that your consumers can use that technology well. You need a team to work with them to design the best system. Getting people to change is harder than adopting new technologies.”

Craig Cook, Vice-President for Solutions Architecture and Engineering, Flexential

In addition to the people component, Cook also stressed the importance of keeping up with risk management. “You cannot have anything in technology these days without some kind of cyber or resiliency layer on top of it,” he explained. “Companies are finally getting executive buy-in because of the public relations nightmares that can happen. People are losing jobs when information is leaked. We provide every layer of security you can think of. It’s about making sure your clients know what they don’t know.”

Kimm then shared that IBM has a similar strategy, and even offers a service based on that attitude. “If you need a Chief Cloud Architect, for example, but you don’t want to hire someone else, IBM can also provide someone for that resource as a staff overlay,” he shared. “Not to replace anyone, but to be an extension of your team.”

Soon, a member of the audience asked a question about disruptive technologies in data center management and operations, such as quantum computing. “Is it like nuclear fusion?” he asked, drawing a few chuckles from audience members. “Is it here now, or is it always going to be ten years away?”

“It’s real now. It’s happening. It’s about confidence and adoption,” responded Kimm, comparing it to two other major paradigms that have similarly transformed data centers — the Cloud and Blockchain. “These other technologies are no different. You can buy it and it can solve problems, but it’s still early.”

Cook then predicted that such technologies are likely to increase infrastructure requirements, at least initially, before also cautioning that it’s still early days. “The adoption rate will drive it. It’s a very nascent space. There is a lot that we don’t know yet about what we don’t know,” he shared. “I don’t believe there is a tsunami coming. It’s going to be one of the trends that we see and work with. We will have to continue to evolve and adapt.”

This means that having a flexible infrastructure platform will be important. “Take liquid cooling, for example: the power requirements to feed it are unknown,” he mused. “We don’t have to predict the future perfectly, but we have to be half a step ahead. Thinking about where those workloads will be, the Internet of Things and the Edge are going to play a role in that. Latency and connectivity will play a bigger part than compute.”

As the conversation neared a conclusion, another member of the audience, Everett Thompson, General Manager for the Wired Real Estate Group, asked the panelists to again look to the future, this time about another big buzzword. “Is 5G for real? How will that impact the Edge?” he asked, mentioning that the power industry has been talking about distributed power plants for a long time, but they have yet to materialize.

“It can be for real, but it depends on adoption of use cases,” replied Cook. “5G is a fair way away in terms of ubiquitous deployment…we talk about the ‘Near Edge’ and the ‘Far Edge.’ We consider ourselves to be a ‘Near Edge’ provider. You’ve got EdgeMicro and VaporIO with containers dropped at the base of a cell tower. There are challenges with that today. You can’t get 12 kW of power at the base of a cell tower. So, you have to make some trade-offs. Most of the apps to support that probably haven’t even been written yet. The paradigm will always exist of what can be central and what needs to be distributed.”

Finally, the discussion concluded with a question from Klebash about what’s next for the industry as a whole: should we expect more complexity, or some simplicity? To that, Cook had a firm answer. “451 Research recently wrote a report where they compared the Cloud to the complexity of a beehive, talking about how it’s not just about simplification,” he mused. “Part of the value that comes from the Cloud is the complexity. It sounds backward, but by decreasing complexity, you decrease value. We see consolidation in the industry as players get acquired, but the number of services they’re releasing isn’t slowing down.”