
“What Guarantees Can You Give Me?” and Other Questions for Cloud Providers

Apr 13, 2018
by Josh Anderson

CHICAGO, IL — Security and compliance are huge factors in decisions about compute and consolidation. Just ask Raymond Parpart, the Director for Data Center Strategy & Operations at the University of Chicago, whose position in a university setting gives him a unique perspective on the issue. At CapRE’s 2018 Chicago Data Center Summit, he shared some insight into the issues he has seen affect big-picture data center decision-making.

“We have a tremendous amount of very confidential data,” Parpart began. “We’ve recently built a cluster for Project Moonshot, which is Vice-President Biden’s pet project to cure cancer. His cluster and research sit in my data center. The principal investigators working on that are part of the university, but along with that come an enormous amount of HIPAA regulations.”

According to Parpart, one of the major difficulties that the university faces is at-risk data. “We’ve recently, for the first time, as a university as a whole, brought legal, the hospital and the university together to realize that we had to address data loss,” he shared. “We don’t want to end up on the cover of the Chicago Tribune because we gave up all of the demographics of the students at public schools because someone hacked our system. Oh and by the way, we do have all of that data.”

When University of Chicago researchers are working on data, said Parpart, the litany of requirements behind it goes beyond HIPAA, FERPA, or PCI. “The list just goes on and on,” he mused. “So putting things into the cloud? That’s not the cloud’s market. They’re not really interested in that type of data yet. They want the data because they want to mine it, but we can’t let them mine that data.”

“As we look at secure data enclaves and other compliant data sets, Azure and AWS are now coming along and realizing that they can get research in big chunks of data,” he offered. “We’re talking petabytes of data, and pushing the exabyte level on some of our clouds. We think we’re going to be at half an exabyte of data next year for a couple of our projects. So when you put that in the cloud, and you’re running heavy-duty research against that, with high performance computing, now the industry is starting to see that there’s a little more money that they can make.”

Raymond Parpart, Director for Data Center Strategy & Operations, The University of Chicago

Parpart is no stranger to shopping around different clouds and cloud providers. “As we’ve looked to keep it inside, it’s really about how we can do it in the safest, quickest, and easiest way, to protect me and protect the data,” he shared. “I’m going to be very selfish, but I don’t want to go to jail and I don’t want to get a ticket. I worked for General Motors and we actually got threatened by the State Department, and that was a very bad day.”

“Those are the types of things that really come to mind when we look at cloud providers,” he continued. “What are the guarantees you can give me? And some of them can’t even guarantee that it’ll be inside the country. They’re going to move your data to any data center, anywhere they need, depending on loads and capacity.”

Luckily, that paradigm is changing, but not without cost. “You can get those agreements now, but you pay for that,” he explained. “So again we come back to what it’s going to cost me. If I want to guarantee that my data is going to stay in the continental U.S., I have to pay for that guarantee. If I want it to be in a particular region of the United States, I have to pay a fee. To the earlier question about costing and how you cost this, well, you’ve got to be very cautious and careful about what all of the components are.”
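For readers weighing similar trade-offs, the region guarantee Parpart describes is usually expressed, at the technical level, as an explicit location constraint when storage is provisioned. The sketch below is a minimal, hypothetical illustration using AWS S3 through boto3; the region and bucket name are assumptions rather than details from the panel, and the contractual residency and pricing terms he mentions sit on top of, not inside, an API call like this.

```python
# Minimal sketch: pinning a storage bucket to a specific U.S. region with boto3.
# The region and bucket name are illustrative assumptions, not values from the article.
import boto3

REGION = "us-east-2"                   # assumed example region in the continental U.S.
BUCKET = "uchicago-research-example"   # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Creating the bucket with an explicit LocationConstraint keeps the objects in
# that region unless you later replicate them elsewhere yourself.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Confirm where the bucket actually lives.
print(s3.get_bucket_location(Bucket=BUCKET)["LocationConstraint"])
```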

