Has Bio-IT Soured On Cloud?
Editor's Note: This article was corrected to accurately represent Don Barber and AstraZeneca's position.
By Joe Stanganelli
January 17, 2018 | For years, the world of bio-IT was reluctant to embrace cloud solutions. The very nature of cloud computing—making data and workflows more easily accessible—appeared antithetical to the hardcore data-protection philosophies and practices that prevail in sectors as highly regulated as healthcare and the life sciences. Life scientists wanted the ease and accessibility of a cloud platform, but their organizations often could not abide the data-stewardship risk.
Then came "cloud-in-a-box" appliances that brought the ease of cloud platforms on-prem and got the vertical more interested in the accessibility of cloud. (See The Turnkey Appliance Revolution Evolution in Genomics.) Cloud adoption—and evangelism—took off from there.
Cloud Forecasts Get Gloomy
Cloud computing was gaining a reputation as an enabler of fast and cheap science—hard to resist for the modern bioinformatician. In a workshop at the Bio-IT World Conference & Expo 2016, Ton van Daelen, senior product director of ScienceCloud (an externalization platform under Dassault Systèmes' BIOVIA brand), observed that "There [are] three types of customers who want to go to the cloud:"
1) Those who want to accelerate innovation,
2) Those who want to lower the total cost of ownership (TCO) of their IT solutions, and
3) Those who want to realize increased agility.
Recently, however, industry pundits have been less positive, even bitter, about their prospects in the cloud. As healthcare and life-science organizations have increasingly gotten onto the bio-IT cloud bandwagon, more problems have risen to the surface.
"I think cloud computing is still a big trend, but I think people are underestimating the value of in-house computing." Mike Dimitruk, an account manager at Red River, told Bio-IT World at last year's Bio-IT World Conference & Expo. "Cloud computing is not a panacea."
As for the cost and agility issues?
"Not all that data can be done in the cloud and be done efficiently," said Dimitruk, citing issues of transfer speed. "Time to market for [life-science companies] is greater [in importance] than low-cost storage."
At the 2017 Bio-IT World Conference & Expo, David Sallak, Vice President of Industry Marketing and CTO of M&E at Panasas, agreed—saying that both time to market and time to publication are of paramount importance in the life-science industry. (On the latter point, one need look no further than the ongoing CRISPR patent kerfuffles.) The common "pay-per-drip" model of most cloud computing SLAs, however, runs counter to the grant-based model of most clinical research, according to Sallak. Until the cloud adjusts for these fixed-budget use cases, some form of dialogue and customized agreement is necessary in the bio-IT space, Sallak said, "so the entity that has a grant doesn't hit a stop sign and can't do their research."
Who’s Worse At Finances?
These cost concerns are not even particularly new.
"It turns out IT guys are really bad at finances," said Don Barber, an enterprise-computing infrastructure architect at AstraZeneca, in a 2016 Bio-IT World Conference presentation about best practices for enterprise cloud adoption. "I think we're going to get lots of pressure to adopt the cloud, even if it isn't cheaper."
Chris Dwan, independent life-science IT consultant and perennial presence at the annual Bio-IT World Conference & Expo, does not think the real concern is one of cost.
"I think that a lot of what [is being described] as 'pressure' and 'having the cloud forced upon us' is driven by that very reasonable feeling of uncertainty of 'What happens to me if we're going to so radically empower the community to provision their own services and we're not going to go through the old rigmarole that we used to?'" Dwan told Bio-IT World in a recent interview. "It can be threatening to people's professional identity, and that's one of the most close to home of these entities that we carry around. What is my value to the organization? Who am I?"
Indeed, for Dwan, the issues that critics point to are more about the upheaval that cloud adoption brings to traditional roles. Specifically, Dwan counters that it is not "IT guys" who "are really bad at finances"—but that the root of the problem lies in clinical researchers and bioinformaticians being "able to spend what is historically IT's money." Thus, without an upfront organizational discussion of this phenomenon and its consequences, "horrifying" scenarios can occur, says Dwan.
"I worked with an organization who committed to an all-cloud data-storage strategy, but they screwed up their math and they found themselves in a situation where they had to spend their entire budget every month to prevent the cloud provider from deleting their data," said Dwan, likening the situation to one where the organization effectively created its own ransomware. "[Modern central IT departments] are more in the business of guardrails and safety belts than of provisioning perfect systems now… If you find yourself as an IT person giving what would have historically been finance and budget advice, then you're on the right path.
The issue boils down, Dwan says, to one of central IT "empower[ing]" the groups that it services so that they can move forward with purchasing their own solutions and not mess up too badly. After all, cloud salespeople—outsiders—may be too self-interested or aggressive to drive this mindset for the customer organization.
While describing how AstraZeneca was able to effectively adapt public IaaS for its purposes, Barber gave frank criticisms of cloud-vendor "hand waving" about efficiency and compliance obstacles to cloud adoption.
"What happens then is that people become evangelists," observed Barber. "And [then] there is 'one path to the cloud.'"
That kind of inflexibility has undeniably alienated one major customer—the US federal government.
Feds Question Cloud Utility
In a CIO keynote panel discussion at this past year's Bio-IT World Conference & Expo, panelist Andrea Norris, CIO of NIH and Director of the Center for Information Technology, presented a blunt assessment of cloud usability from the public-sector perspective, given that cloud-industry standards don't exactly align with public-sector preferences.
"One of the things that we're struggling with and challenged by—but determined to make a lot of progress in—is how to leverage commercial cloud platforms in a way to curate these very, very large datasets," said Norris, "and make them findable, accessible, …and useable in an easy way[.]"
The one-size-fits-all approach of many commercial cloud vendors has stuck in the federal government's craw for some time. The National Institute of Standards and Technology (NIST) and other government agencies have long expressed collective dissatisfaction with cloud solutions—to the point of advocating for utility-like regulation of the entire industry. In a session at the prior year's Bio-IT World Conference, NIST representative Michaela Iorga at once talked up cloud vendors' capabilities and questioned their ability to deliver to the federal government's satisfaction.
"Those cloud providers… have the potential to concentrate technology," said Iorga, "probably better than what we have in house."
At the same time, Iorga talked about cloud vendors' "need to meet the government requirements," explicitly calling cloud computing "a utility". Saying something, however, does not make it so. These sentiments are in line with the pointed criticism NIST published in its US Government Cloud Computing Technology Roadmap, in which NIST lamented the federal government's present inability to tame cloud providers to its liking by way of regulations that only apply to utilities.
This context adds clarity to the federal government's wishful thinking on a cloud utopia.
"We spend more than a billion dollars on computational infrastructure on our campus," said Norris. "These are expensive investments… but we really are not able to harness the value of it… You talk to any cloud vendor [about your needs] and they'll tell you 'Absolutely that can be done,' but it can't."
Trenchant Criticism from Dagdigian and Company
In his perennial "trends from the trenches" talk at last year's Bio-IT World Conference, BioTeam Founding Partner and Director of Technology Chris Dagdigian built on Barber's and Norris's criticisms of vendors.
"I hate when vendors play fast and loose with the truth," griped Dagdigian, complaining that vendor evangelism has failed to fairly represent real-world clinical-research situations—leading life-science organizations to "devolv[e] into a hybrid cloud compromise solution."
At the Pistoia Alliance's Annual USA Conference earlier this fall, attendees pointed to cloud infrastructure, scalability, and abstraction of services as key sticking points in their own commercial organizations' IT.
"The cloud is a positive and a negative," Mary Donlan, Executive Director of Market Development at PerkinElmer, told Bio-IT World in an onsite interview at the Pistoia meeting. "If you're just looking at [merely cloud] data storage, it's actually cheap."
When it comes to privacy and scalability, however, Donlan and her peers noted that—collaborative benefits aside—the cloud is not perfect.
"[I]n the ELN world, it will be the place where data is stored and aggregated… [but] clearly Europe is lagging behind a bit and adoption of clouds," Donlan added, referencing the EU's notorious reputation for data-privacy sensitivity and the upcoming GDPR. She continued, however, that localized data centers—to the extent that they are available, workable solutions—make such concerns readily solvable.
"[Organizations] have found ways to be [cloud] compliant," Gerhard Noelken, a European business development executive for the Pistoia Alliance, later told Bio-IT World at the same event. "I think the biggest problem is not where you store the data but the quality and consistency of the data. We don't have good data standards… with the right ontologies."
"The automated workflows are generating more data faster than a human ever could," said BioTeam senior scientific consultant Asya Shklyar in her panel with Dagdigian. "That model in the cloud is becoming more prevalent on prem—or should be."
"Cloud is a capability play and not a cost play; even if you have the hardware on prem you just can't fit the software," added Shklyar's fellow BioTeam senior scientific consultant Aaron Gardner. "Really unifying and simplifying what we do allows us to make room for what's next [because] things are getting more complicated, not easier."
Gardner had opened the panel session by declaring that he doesn't "really like cloud," but went on to note that he had accepted cloud computing as remaining a reality for researchers and bioinformaticians—like it or not.
"I think that over time people will put two and two together [and that] workflows will start to be done on prem with the same level of software engineering that we have in the cloud," said Gardner. "It's just going to be a matter of time. I don't know if the cloud providers are going to [facilitate this]."
For his part, Sallak maintained that because customer perception of the cloud has changed in the industry, cloud providers must adjust.
"[The bio-IT cloud] cannot be sensitive to time," said Sallak. "It has to be sensitive to discovery."
Editor’s Note: Get the latest look at the cloud for bio-IT and drug discovery at the Converged IT and the Cloud program at the 2018 Molecular Medicine Tri-Conference in San Francisco, February 11-16. Chris Dwan and a host of other experts will share best practices and the latest technologies enabling bio-IT research.