Overcoming Challenges to Machine Learning Adoption and Implementation in the Lab

New universal lab connectivity can pave the way for advanced analytics in the life sciences community


Data comes from a huge variety of sources in the lab, yet the laboratory has lagged behind other industries in building an ecosystem of connectivity around those sources. The life sciences community must figure out how to bring this vast amount of disparate data together in an accessible, usable way to develop better insights and outcomes.

This is an exciting time for the industry. Crucial technologies have matured, including the Internet of Things (IoT), artificial intelligence (AI), computational power, and sequencing. A spirit of digital innovation is entering pharma and revolutionizing the sector. The scientific community welcomes this disruption and is starting to take a high-tech approach to solving old problems, such as connecting a diverse fleet of analytical equipment to enable evidence-based discoveries.


The need for data-driven insights to accelerate drug discovery and development was highlighted beyond all doubt by the pandemic. Laboratory productivity and connectivity were thrown into the spotlight as rapid, global data sharing was urgently needed for effective collaboration 24/7.

Machine learning (ML), a practical application of AI, is starting to enable advanced analytics for labs. The good news is that ML performance improves as data volumes grow: the more data available, the better the systems can be trained to recognize patterns in previous input and predict likely outcomes.
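By way of illustration only, here is a minimal Python sketch (our example, not from the article) of that idea: a simple classifier is trained on growing volumes of synthetic "assay" data, and its accuracy tends to rise as the training set grows. The feature meanings and the data generator are invented purely for demonstration.

    # Toy demonstration that more training data tends to improve ML accuracy.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(42)

    def make_assay_data(n_samples):
        """Generate toy instrument readings and a binary pass/fail outcome."""
        X = rng.normal(size=(n_samples, 4))            # e.g., pH, temp, conc., signal
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hidden rule the model must learn
        return X, y

    # Train on increasing volumes of historical data; accuracy tends to rise.
    for n in (100, 1_000, 10_000):
        X, y = make_assay_data(n)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
        print(n, accuracy_score(y_test, model.predict(X_test)))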

So, what is holding things up? 

One major roadblock to implementing ML is the lack of modern infrastructure to connect all data sources across the lab and enable the free flow of information between multiple endpoints. In this technologically advanced world, there is still a huge disconnect between the sophistication of scientific instrumentation and the ability to gather, integrate, and curate the resulting data. 

Digital transformation of the lab depends on getting experimental data from the wide diversity of sources (instruments, applications, and informatics) into a common location and, critically, within a structured framework. 
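To make "structured framework" concrete, the sketch below shows one hypothetical way to normalize results from disparate sources into a common record. Every field name here is our assumption for illustration, not a published schema.

    # One possible shape for a common, structured experimental record.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ExperimentalRecord:
        instrument_id: str                 # which endpoint produced the data
        method: str                        # e.g., "HPLC-UV"
        sample_id: str
        results: dict[str, float]          # named, numeric measurements
        acquired_at: datetime
        metadata: dict[str, str] = field(default_factory=dict)  # operator, batch, units

    record = ExperimentalRecord(
        instrument_id="hplc-07",
        method="HPLC-UV",
        sample_id="S-2023-0412",
        results={"retention_time_min": 4.82, "peak_area": 153204.0},
        acquired_at=datetime.now(timezone.utc),
        metadata={"operator": "jdoe", "units": "mAU*s"},
    )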


Maintaining digital data integrity

It is important to maintain digital data integrity within this framework. There are several factors to consider:

  • Connectivity and business rules

Without universal instrument connectivity and the business rules to govern the flow of data, the life sciences industry will never be ready to fully embrace ML and take advantage of the insights it can provide. The key is to enable automated digital access to processed data from the widely diverse set of instruments and applications in the lab. This can be achieved using a modern cloud-based infrastructure that enables the free flow of data across interconnected laboratory resources. 

Controlling the flow of data from all endpoints brings new value to the creation of the strategic data lakes (repositories) that are essential to feed and train the ML models.

  • Usability

Once the free flow of data is in place, the next essential step is to curate the data into a usable format. The ability to efficiently curate data on the fly and either correct or flag missing or incorrect metadata can dramatically increase the volume of usable data coming directly from the data sources.

  • Data integrity

This approach saves considerable time while maintaining digital data integrity. Transactional data stored from the processes described above can be mined and recovered as needed, and this must hold for every transactional event within the system. The result is a full chain of custody covering all data exchange transactions, supporting a lab-wide data integrity model for data in flight; a minimal sketch of one such event log appears below.
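The following is a hypothetical Python sketch of such a chain-of-custody log: every data exchange is recorded as a hash-chained event so transactions can be mined and audited later. This is our illustration of the concept, not any vendor's implementation.

    # Minimal hash-chained event log for data-exchange transactions.
    import hashlib
    import json
    from datetime import datetime, timezone

    class CustodyLog:
        def __init__(self):
            self.events = []

        def record(self, source, destination, payload):
            """Append a transaction event linked to the previous one by hash."""
            prev_hash = self.events[-1]["hash"] if self.events else "0" * 64
            event = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "source": source,
                "destination": destination,
                "payload_digest": hashlib.sha256(
                    json.dumps(payload, sort_keys=True).encode()
                ).hexdigest(),
                "prev_hash": prev_hash,
            }
            event["hash"] = hashlib.sha256(
                json.dumps(event, sort_keys=True).encode()
            ).hexdigest()
            self.events.append(event)
            return event

    log = CustodyLog()
    log.record("hplc-07", "data-lake", {"sample_id": "S-2023-0412", "peak_area": 153204.0})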

Data curation to support AI

Machine learning can connect the dots between enormous amounts of experimental data. Currently, however, too much time is spent curating data, trying to track down the correct data, and solving issues with that data—such as missing information, incomplete metadata, and mistakes. That time would be better spent using the data for scientific advancement. 

Leveraging the transactional nature of the process described above, data can be inspected on the fly (as long as the business rules are known), flagged as incorrect, and isolated for correction. Data identified in this way can trigger an alert calling for a review of its accuracy, or for missing data to be added, before it enters the repository. Furthermore, business rules can be stored and applied to the data on the fly: if metadata deviates from the pre-set format defined by its stored rule, the deviation can be detected and the data either reshaped automatically or flagged for manual curation, as in the sketch below.
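Here is a minimal Python sketch of that pattern: validate each record in flight against stored business rules, auto-correct what a rule can fix, and flag the rest for manual curation. The rules, field names, and reshaping step are all hypothetical.

    # On-the-fly metadata validation against stored business rules.
    import re

    # Stored business rules: field -> (expected pattern, optional auto-fix).
    RULES = {
        "sample_id": (re.compile(r"^S-\d{4}-\d{4}$"), None),
        "operator":  (re.compile(r"^[a-z]+$"), lambda v: v.strip().lower()),
    }

    def curate(record):
        """Return (record, flags): reshape fixable fields, flag the rest."""
        flags = []
        for name, (pattern, fix) in RULES.items():
            value = record.get(name)
            if value is None:
                flags.append(f"missing metadata: {name}")
                continue
            if not pattern.match(value):
                if fix and pattern.match(fix(value)):
                    record[name] = fix(value)                # reshape on the fly
                else:
                    flags.append(f"invalid {name}: {value!r}")  # manual curation
        return record, flags

    record, flags = curate({"sample_id": "S-2023-0412", "operator": " JDoe "})
    print(record, flags)   # operator reshaped to "jdoe"; no flags raised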

The ability to efficiently curate the data generated by the huge variety of laboratory instrumentation, software, and applications is necessary for the successful deployment of ML models.  

The role of digital decoupling in digital transformation

Digital decoupling plays an essential role in bringing digital transformation to the lab. Here, Pat Pijanowski, managing director within Accenture’s Scientific Informatics Services Practice and an expert in digital laboratory transformation, shares his views on this critical success factor and the implementation of a digital strategy in the laboratory environment:

“When seen through the broader lens of an overall digital lab transformation strategy, an agile approach to its implementation and the concept of digital decoupling can be crucial. Digital decoupling can break the perpetual cycle of technical debt that has come to plague traditional monolithic laboratory applications and trapped user communities in multi-year cycles of implementation and deployment. The length of these cycles across multiple sites often results in an unsupported version of software in production—thus triggering the requirement to upgrade the system and start the deployment process all over again.

“Employing a digital decoupling approach to separating the instrument connectivity layer from higher level enterprise lab informatics systems has the potential to help break this cycle, thus liberating significant resources which can be re-deployed toward driving innovation projects and generating incremental business value for the organization.”

Implementing a digital lab exchange infrastructure

The future acceleration of science will depend on a strategic, holistic, and flexible digital data exchange infrastructure that decouples the application layer from the integration layer to give users the freedom of choice of lab technology. The ability to exchange data at will between all informatics solutions, instruments, and lab resources is crucial for the AI programs that will support the digital lab of the future.
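As a sketch of what decoupling the application layer from the integration layer can look like in code (our illustration; the class and method names are assumptions), applications below depend only on an abstract connector interface, so any instrument can be swapped in without touching application code.

    # Applications see only the abstract integration layer, never vendor code.
    from abc import ABC, abstractmethod

    class InstrumentConnector(ABC):
        """Integration layer: the only surface the application depends on."""
        @abstractmethod
        def fetch_results(self, sample_id: str) -> dict: ...

    class BalanceConnector(InstrumentConnector):
        def fetch_results(self, sample_id: str) -> dict:
            return {"sample_id": sample_id, "mass_mg": 101.3}    # stubbed reading

    class PlateReaderConnector(InstrumentConnector):
        def fetch_results(self, sample_id: str) -> dict:
            return {"sample_id": sample_id, "absorbance": 0.42}  # stubbed reading

    def application_layer(connector: InstrumentConnector, sample_id: str) -> dict:
        # Swapping instruments is a one-line change at the call site,
        # not a re-implementation of the application.
        return connector.fetch_results(sample_id)

    print(application_layer(BalanceConnector(), "S-2023-0412"))
    print(application_layer(PlateReaderConnector(), "S-2023-0412"))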

Outcomes such as faster drug development, improved accuracy, discovering larger molecules, and making way for innovative therapies are only the beginning. Ultimately, universal lab connectivity will enable robust analytics and accelerate the journey toward a new era in life sciences. 

About the Author

  • Dave Levy, Global Product Director at Scitara Corporation

    Dave Levy has experience in strategic domestic and international software development, as well as in project management and sales within the life science marketplace, and is responsible for leading Scitara's global product innovation strategy. Over his 30-year career, he has held roles at leading life science software companies, including NuGenesis Technologies and CambridgeSoft Inc., and most recently at PerkinElmer, where he led key strategic accounts.
