Trendspotting: What’s Coming For Bio-IT In 2021

January 7, 2021

January 7, 2021 | When we spoke with the Bio-IT World vendor community, leaders reported working hard to synthesize what 2020 brought us and apply those learnings to 2021. As they each made predictions in their own business areas, some themes arose.

Artificial intelligence and automation came up again and again. “While a theme for years, AI in the lab will become more commonplace,” said Bob Voelkner at LabVantage Solutions. Even as there will be very public AI failures along the way—“video deep fakes, maligned bots, not really autonomous cars, etc.,” said Tom Chittenden of Genuity Science—“it is vital that healthcare not get tarred with the same brush.”

Accompanying the expected increase in AI and automation solutions are needed upgrades to data management, storage, and lab infrastructure. “To realize AI at scale… requires a modern data infrastructure that re-imagines the role of data and how it is used,” said Josh Gluck of Pure Storage. The options are many, with cloud solutions, data fabrics, and a range of storage architectures emerging as contenders.

If 2020 showed us anything, it is that speed matters: companies equipped to access and act on data, and to turn around solutions quickly, have a competitive advantage. “Large pharma companies are looking to increase their competitive edge,” said Ashu Singhal with Benchling. They “realize [that] how fast they can take in new scientific technologies that fit into their data management infrastructure matters.”

Singhal predicts more partnerships and acquisitions by big pharma, but many noted that 2020 fostered a collaborative atmosphere among smaller players as well. “The fight against COVID-19 has been prioritized across the globe,” said PerkinElmer’s Arvind Kothandaraman. “In that sense, pharmaceutical and biotech, which are conventionally considered competitors, have joined together to work towards the same goal... The collaboration has been unprecedented, and we’ll see this approach continue in many ways moving forward.”

And to bring us full circle, Intel’s Stacey Shulman sees AI and federated learning as particularly crucial to accelerating and streamlining collaborations, “making it easier for healthcare professionals to deliver quality care to their patients as well as stay up to date on new treatment options.”

Here are the full trends and predictions, including additional forecasts for lab agility, supply chain management, data sharing, storage options, and choosing drug targets. –the Editors

 

Ashu Singhal, Co-Founder and President, Benchling

Pharma will expand beyond just being product companies and transition into service organizations. Pharma is being forced to figure out not just how to produce a drug product, but also how to deliver it. For example, Pfizer’s vaccine will have to be transported at -80°C, and Fauci recently said that 75% of the country will need to take the vaccine for it to work. Delivering the vaccine is a logistical challenge that they must overcome.

Large pharma companies will increase their partnerships and/or acquisitions of scientific technologies from SMBs and academics. Large pharma companies are looking to increase their competitive edge and realize that how fast they can take in new scientific technologies that fit into their data management infrastructure matters. We’ll see more acquisitions like Gilead’s recent purchase of Immunomedics, an ADC company, at a price many analysts thought was crazy but was likely driven by competitive pressure.

More R&D organizations will continue moving their infrastructure to the cloud because of COVID-19, and ML investments will rapidly increase. The increase in ML investment is even larger in pharma due to the excitement around AI-based drug discovery in the biotech industry.

 

Bob Voelkner, Vice President, Sales and Marketing, LabVantage Solutions

Leverage advanced analytics with focus on AI and ML. While a theme for years, AI in the lab will become more commonplace, as advanced tools for dashboarding and visualizations – while sophisticated – make it possible for users to explore, interrogate, and analyze data without asking IT for help.

Increased focus on LIMS platform hardening to defend against cyber threats. As the lab becomes more integrated into the enterprise IT ecosystem, our job is to ensure the integrity of the LIMS platform against external threats.

Greater demand for Cloud and SaaS LIMS, especially validated SaaS. Customers in regulated industries like pharma and food and beverage are interested in the benefits Cloud hosting and SaaS offer for reducing upfront capital expenses and accelerating deployment, making validated SaaS essential in 2021.

Continued strong desire for purpose-built and turnkey LIMS solutions. Customers have lost their appetite for lengthy, custom, complex implementations. Systems that are built for purpose and easily integrated into the ecosystem will continue to win the day.

Organizations continue to execute on their Digital Transformation plans. Digital Transformation is a key strategy driving standardized, enterprise-wide LIMS to support business harmonization, standardization, and paperless operations in the lab.

 

Renen Hallak, Founder and CEO at VAST Data

Healthcare is the first industry to go all-in on flash. The industry learned a hard lesson this year as it raced to deliver the research, testing, manufacturing, vaccination, and deployment answers the world needed. The problems with tiered storage show up most prominently at scale with analytics. These critical data sets now have a value that is proportionate to their size, which throws the value of storage tiering out the window. 2020 proved that our front-line systems could not deal with the latency of mechanical media, and low-cost flash price points are now compelling enough that organizations no longer need to choose between performance and budget.

 

Kevin Cramer, CEO and Chief Scientist, Sapio Sciences

We are seeing a continuation of a trend from 2020 of life sciences companies looking to consolidate information technology solutions such as LIMS, ELN, and SDMS. The digital lab is becoming extremely important for enabling high throughput, agility in the face of changing lab technology, and use of new advanced analytics tools such as machine learning. Having disparate and/or inflexible systems is a non-starter for going digital, hence the change in direction from bottom-up decisions on software to more strategic, top-down initiatives.


Arvind Kothandaraman, PerkinElmer

Collaboration: Collaboration among scientists is the backbone of labs. The fight against COVID-19 has been prioritized across the globe, and this has accelerated how all organizations work in a united effort to ultimately serve the public. In that sense, pharmaceutical and biotech, which are conventionally considered competitors, have joined together to work towards the same goal. Information sharing will help ensure the abundance of testing kits and therapeutics for everyone and everywhere. The collaboration has been unprecedented, and we’ll see this approach continue in many ways moving forward.

Digital technologies: In modern laboratories, digital tools are essential. With the enormous amount of data generated and the ever-increasing number of tests being run, labs could not function without them. Technologies including automation, artificial intelligence, and machine learning are evolving and improving every day, which will go a long way toward supporting labs in their efforts to combat pandemics like the one we’re in now. Their ability to conduct high-content screening and generate information-rich data from cellular samples faster expedites processes and improves throughput, enabling labs to meet requirements in real time. With demand for laboratory services growing exponentially, these tools will be key in delivering robust, traceable solutions and assuring data accuracy with minimal user interference.

 

Stacey Shulman, VP, IoT Group and GM, Health, Life Sciences and Emerging Technologies, Intel

One of the things currently holding the healthcare industry back is the lack of standardized medical records and data sharing across organizations. Collaboration in the medical industry for the purpose of solving illness and health issues can be critical, especially when it comes to public health crises and tracking population health, as we have seen with the pandemic this year.

In 2021, we will see improvement in the delivery models for information sharing as emerging technologies such as AI and federated learning become more ubiquitous in healthcare. In addition to powering innovations like telehealth, these technologies will accelerate and streamline the collaboration process, making it easier for healthcare professionals to deliver quality care to their patients as well as stay up to date on new treatment options.
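Federated learning, in this context, means training a shared model across institutions without pooling patient data: each site trains locally and only model updates are exchanged. As a rough, illustrative sketch of that idea (not Intel's implementation; the hospitals, data, and model below are hypothetical), federated averaging can be as simple as a size-weighted mean of locally trained weights:

```python
import numpy as np

# Toy federated averaging: each site fits a model on its own data and
# only shares model weights, never patient records. Illustrative only.

rng = np.random.default_rng(0)

def local_fit(X, y, weights, lr=0.1, epochs=50):
    """Train a logistic-regression-style model locally with gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

# Hypothetical per-hospital datasets (features, binary outcomes).
sites = {
    "hospital_a": (rng.normal(size=(200, 5)), rng.integers(0, 2, 200)),
    "hospital_b": (rng.normal(size=(150, 5)), rng.integers(0, 2, 150)),
    "hospital_c": (rng.normal(size=(300, 5)), rng.integers(0, 2, 300)),
}

global_w = np.zeros(5)
for round_num in range(10):  # communication rounds
    local_weights, local_sizes = [], []
    for name, (X, y) in sites.items():
        local_weights.append(local_fit(X, y, global_w))
        local_sizes.append(len(y))
    # Federated averaging: size-weighted mean of the local models.
    global_w = np.average(local_weights, axis=0, weights=local_sizes)

print("Shared model weights after 10 rounds:", global_w)
```

Real deployments add secure aggregation, privacy accounting, and far richer models, but the data-stays-local pattern is the same.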

 

Adam Marko, Director of Life Sciences, Panasas

Discovery from imaging data will require a different kind of IT. Researchers are gaining access to troves of rich data thanks to advancements in exciting imaging technologies such as Cryogenic Electron Microscopy (CryoEM) and Lattice Light Sheet Microscopy (LLS). But these research pipelines require a data storage foundation that delivers high performance in a reliable, scalable, and adaptable way. Forward-thinking pharmaceutical organizations are thus swapping scale-out network-attached storage (NAS) systems for parallel file systems, which shine with these imaging workflows. Unlike traditional NAS file systems, parallel systems allow researchers to perform a wide range of analyses against much larger datasets (such as images plus genomics). That is essential for CryoEM and LLS. CryoEM has already shown promise in pharmaceuticals, and the potential for LLS is growing. With the help of parallel storage infrastructures, that potential is far more likely to be met.

 

Kendall Clark, founder & CEO, Stardog

The reality of digital transformation is that most “data-driven” efforts are doomed to fail, primarily because machines are not humans! Human decision-making is based on contextual intelligence, and in order to successfully automate, machines need to know what we know. One technology that is helping organizations address this need is the enterprise knowledge graph (EKG), a modern data integration approach that allows organizations to discover hidden facts and relationships through inferences that humans would otherwise be unable to catch at scale. EKGs make knowledge not just machine-readable but machine-understandable by capturing real-world context from disparate data sources on a specific topic, person, project, etc.
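As a rough illustration of the kind of inference an EKG enables (a toy triple store with a single hand-written rule, not Stardog's product; all entities here are hypothetical), composing facts from separate sources can surface a relationship that no single source records:

```python
# Toy knowledge graph: facts from disparate sources stored as triples,
# plus one inference rule that surfaces a link no single source records.
# Entities and the rule itself are hypothetical, for illustration only.

facts = {
    # (subject, predicate, object) drawn from different "systems of record"
    ("drug_x", "inhibits", "protein_p"),          # from an assay database
    ("protein_p", "drives", "pathway_q"),         # from a literature source
    ("pathway_q", "implicated_in", "disease_d"),  # from a clinical knowledge base
}

def infer_candidates(triples):
    """Chain inhibits -> drives -> implicated_in to propose drug-disease links."""
    inhibits = {(s, o) for s, p, o in triples if p == "inhibits"}
    drives = {(s, o) for s, p, o in triples if p == "drives"}
    implicated = {(s, o) for s, p, o in triples if p == "implicated_in"}
    inferred = set()
    for drug, protein in inhibits:
        for prot, pathway in drives:
            if prot != protein:
                continue
            for path, disease in implicated:
                if path == pathway:
                    inferred.add((drug, "candidate_for", disease))
    return inferred

print(infer_candidates(facts))
# {('drug_x', 'candidate_for', 'disease_d')}
```

Production knowledge graphs express such rules declaratively and run them over millions of triples, but the principle is the same: context from separate sources becomes a machine-usable inference.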

Data fabrics are heralded for their ability to weave together existing data management systems, enriching all connected apps, and are considered the next step forward in the maturation of the data management space. Data lakes once held the promise of centralizing an enterprise’s assets but failed to make the data usable. Data warehouses are, in fact, even less capable than data lakes, since they only admit structured data to begin with, leaving semi-structured and unstructured data silos completely disconnected. Data catalogs have since emerged to provide an inventory of the bewildering diversity of enterprises’ data landscapes, only to be faced with the next great challenge: how to make this data usable and reusable at enterprise scale?

 

David Sprinzen, Director of Marketing, Vantiq

COVID reminded the world of the need for solutions that improve the efficiency and safety of plant operations. Applying those successfully will be critical for Pharma 4.0 to reach its potential. Fortunately, more labs are deploying networks of IoT devices and sensors that can collect information from their environments, allowing companies to respond in real time to any changes in health conditions or manufacturing operations on the plant floor. Similarly, the myriad pieces of sensory data generated by this smart network will reveal a lot about the work environment. It might be about a particular machine’s status or conditions related to the broader business environment. The information gets contextualized, combined, and correlated so that the organization can apply best practices and truly operationalize its data.

Facility managers will be able to detect and respond to potential trouble in minutes, not hours, leading to the creation of a safer, more efficient manufacturing environment.
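As a rough sketch of what contextualized, combined, and correlated sensor data can look like in practice (a generic event-handling loop, not Vantiq's platform; the machine names, limits, and readings are made up):

```python
from datetime import datetime, timezone

# Static context about each machine, joined against live sensor readings.
# Machine IDs, limits, and readings below are hypothetical.
MACHINE_CONTEXT = {
    "bioreactor-07": {"line": "fill-finish", "temp_limit_c": 40.0, "vib_limit_mm_s": 6.0},
    "lyophilizer-02": {"line": "fill-finish", "temp_limit_c": 55.0, "vib_limit_mm_s": 4.0},
}

def evaluate(reading):
    """Combine a raw reading with machine context and flag correlated anomalies."""
    ctx = MACHINE_CONTEXT.get(reading["machine_id"])
    if ctx is None:
        return None
    over_temp = reading["temp_c"] > ctx["temp_limit_c"]
    over_vib = reading["vibration_mm_s"] > ctx["vib_limit_mm_s"]
    if over_temp and over_vib:   # correlated signals suggest a mechanical issue
        severity = "critical"
    elif over_temp or over_vib:
        severity = "warning"
    else:
        return None
    return {
        "machine_id": reading["machine_id"],
        "line": ctx["line"],
        "severity": severity,
        "at": datetime.now(timezone.utc).isoformat(),
    }

# Example stream of readings; in practice these arrive from the IoT network.
stream = [
    {"machine_id": "bioreactor-07", "temp_c": 41.2, "vibration_mm_s": 6.8},
    {"machine_id": "lyophilizer-02", "temp_c": 48.0, "vibration_mm_s": 2.1},
]
for event in stream:
    alert = evaluate(event)
    if alert:
        print("ALERT:", alert)
```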

 

Paul Moxon, SVP, Data Architecture at Denodo

Extreme Automation: The pandemic has accelerated the digital transformation process for many companies, and for many this has meant trying to automate every process and part of their businesses as much as possible. Companies are replacing legacy, human-driven business processes with digitally enhanced workflows to achieve much greater business agility and reduce dependency on human participation, which may be limited by safety regulations and policies. Systems supported by AI/ML can improve over time. With automation software and cloud orchestration services, companies are pushing the limits of automation not only to function better during the pandemic but to create the foundations for fundamentally more efficient ways of managing their businesses.

 

Rafael Rosengarten, CEO, Genialis

2020 and COVID required a collaborative approach to science in unprecedented ways. In the second half of 2021, when we return to more in-person-type interactions, the over-index on collaboration will be a good thing. I think that folks are going to be really eager to continue to amplify this collaborative approach to science. In addressing COVID-19, the standing up of working groups and consortia brought a lot of pre-competitive sharing that is going to have to keep going. COVID is a big wake-up call for a lot of us. What I’m most excited about looking ahead is linking the various silos or workgroup functions in drug discovery and development and thinking about how we start to connect them.

In AI, we have breakthrough technologies in generative chemistry (coming up with new molecules and figuring out where they hit), in target discovery, and in the area where Genialis mostly works, biomarkers, in the latter half of the drug development lifecycle, from translational medicine into the clinic, through clinical trials, and into the market. The question then becomes how we think about building systems, frameworks, and technology to link all of those activities and AI solutions, and to what extent the links themselves will be based on artificial intelligence. In 2021, it's about connective tissue: it's connecting people, connecting data. It's thinking about these different AI application areas, which are now ripe enough that they should connect to and feed back into one another.

 

Tom Chittenden, Chief Data Science Officer, Genuity Science

AI will take its lumps in other industries. It is vital that healthcare not get tarred with the same brush. Although AI is expanding our understanding of biology at the cellular level, there will be failures along the way in other sectors (video deep fakes, maligned bots, not really autonomous cars, etc.). But AI is helping uncover life-saving discoveries (as we noted in a recent aortic aneurysm paper) and improve disease treatment research, data-collection analysis, and more. By classifying human disease based on cellular data—like what is done with cancer—we can leverage this for other diseases. This approach is “disease-agnostic,” and can help advance our understanding of human biology.

 

Jeffrey Gulcher, Chief Scientific Officer & Co-Founder, Genuity Science

More drug targets discovered using genetics will be launched into clinical trials: COVID has changed the trajectory of drug development; the industry has shown it can move faster. But choosing the right human, genetically validated targets for drugs at the start will only help the clinical trial process. Why? Better targets at the start could save millions of dollars downstream. If you get it wrong at the start, it is wrong throughout the process.

 

Josh Gluck, Vice President, Global Healthcare Technology Strategy, Pure Storage

Hospitals must defend the data first and foremost to ensure operational continuity. There’s a triple threat facing U.S. hospitals right now—the uptick in COVID cases; DHS, FBI, and HHS’ joint warning of imminent and increased ransomware risk; and a higher overall security risk during the election transition. These factors further challenge health organizations to support clinicians, serve patients, and protect their most important data. To defend against this massive triple threat, hospitals not only have to double down on backups but focus on their ability to restore data rapidly.

The appetite for faster time to science is voracious and will likely continue. The world’s scientific community continues to break records in the fight against COVID-19 – leveraging massive information sharing that is leading to a more accurate picture of COVID-19 and accelerated development and testing of vaccines and therapeutic treatment candidates. We’ve seen what can be done faster than ever before imagined. Health sciences organizations across the board seek to build on this momentum safely and effectively to further accelerate the pace of personalized medicine. Genomics and AI are key to this quest. To realize AI at scale, however, requires a modern data infrastructure that re-imagines the role of data and how it is used.

 

Janardan Prasad, CBO and Head of Life Sciences at Lore IO

Self-serve data platforms for life sciences analytics: 2021 will be the year when most life sciences companies not only own their data platform but also make their teams more agile and self-serve because of it. The need for agility and self-service has always been top of mind for emerging pharma organizations, but 2020 emphasized even more the need to handle market uncertainties around product launches and sales operations. Moving into 2021, unified data platforms powered by common data models (CDMs) and advancements in AI and cloud technology will enable self-service for business and IT teams. Self-serve data platforms will make it easy for business analysts to manage data and business rules, maintain a single source of truth, and offer business agility to clinical operations, medical affairs, and commercial teams.

 

Brad Hamilton, Founder and Chief Science Officer, GoodCell

The pandemic has laid the groundwork for a surge in cell-based therapies. The pandemic demonstrated how rapidly research can respond to a crisis and develop therapies. This, paired with the overarching shift toward using one’s own biology to treat illness—self treating self—is leading to an uptick in cell-based treatments. In just the first three quarters of 2020, regenerative medicine set a new annual record of $15.9B in financing, up 242% over the previous year. This trend is likely to continue as a result of the accelerated workflows spurred by the pandemic. Coming out of the pandemic, we’re going to see an even greater emphasis put behind cell and gene therapies for a variety of conditions, most notably cancer. Advances in life science technologies—with greater sensitivity, precision, and intelligence—are enabling a deeper study of disease. The more granularly we can evaluate the mechanisms of disease at the individual patient level, the better we’ll be at identifying new and more personalized ways of addressing them. The years ahead will see tremendous strides in genomics, particularly around cell-free DNA and single-cell analysis, which will be crucial to earlier detection of disease as well as to developing new therapies for diseases that are currently untreatable.