Beyond Orchestration: The Evolving Role of Schedulers in Lab Automation

How advanced scheduling technologies boost efficiency and reliability


João Bessa Pereira is a product manager at Automata, leading the development of LINQ Cloud, the software component of LINQ—the complete lab workflow automation platform. His focus is on developing software that is user-friendly, secure, and fit for the diverse scientific environments it serves. Over the past year, João has concentrated on creating a new kind of automation scheduler for the platform.


João Bessa Pereira, product manager at Automata (Credit: João Bessa Pereira)

Q: Can you explain the role of schedulers in lab automation solutions? 

A: Schedulers often take center stage when discussing lab automation tools because they play a crucial role in facilitating walkaway time and keeping everything on track. On a basic level, they optimize workflows and ensure they’re delivered successfully by determining the most efficient order and timing of workflow tasks. However, they bring many more benefits beyond execution planning.

Fit-for-purpose schedulers should, for example, include deadlock prevention, enable time constraints and conditional decision-making, optimize throughput, and overcome run errors.

Ultimately, schedulers should enable labs to execute more automated workflows, reduce validation and run time, generate fully contextualized data, and operate automation platforms remotely and reliably.

Q: What types of schedulers do you usually see in conjunction with lab automation?

A: Most labs have moved on from simply tracking experiments on paper or with spreadsheets. We now see two popular types of schedulers in use: static and dynamic.

Static schedulers make decisions based on known constraints, so you have to plan ahead, considering all dependencies and variables upfront to determine the correct order and timing of workflow tasks. These schedulers typically lack the functionality to change the course of execution during the run in response to task-tracking events. However, they do offer predictability and a stable plan for workflow delivery in controlled environments.

Dynamic schedulers adapt and optimize workflow delivery in response to changing conditions and unexpected events. They continuously monitor and adjust workflow tasks to maintain efficiency and ensure successful completion. Their high degree of flexibility makes them ideal for environments where conditions can change rapidly and unpredictably.
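To make the contrast concrete, here is a minimal, hypothetical Python sketch (the Task class, the event dictionary, and the function names are illustrative assumptions, not any vendor's implementation): a static scheduler fixes the task order once before the run starts, while a dynamic one re-prioritizes tasks as run-time events arrive.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Task:
    start_time: float                       # planned start, minutes from run start
    name: str = field(compare=False)
    duration: float = field(compare=False)

def static_schedule(tasks):
    """Static scheduling: the full order is computed up front from known
    constraints and is not revisited once the run begins."""
    return sorted(tasks)                    # plan once, before execution

def dynamic_schedule(tasks, events):
    """Dynamic scheduling: tasks are re-prioritized as run-time events
    (e.g. an instrument running long) arrive, so the plan can change mid-run."""
    queue = list(tasks)
    heapq.heapify(queue)
    executed = []
    while queue:
        task = heapq.heappop(queue)
        delay = events.get(task.name, 0.0)  # reported delay for this task, if any
        if delay:
            task.start_time += delay        # push the task back...
            heapq.heappush(queue, task)     # ...and re-plan around it
            events.pop(task.name)
            continue
        executed.append(task)
    return executed
```

A production scheduler also has to respect instrument locks and timing constraints between steps; this sketch only illustrates the plan-once versus replan-as-you-go distinction.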

Our platform, LINQ, features a new kind of advanced scheduler with what we’re calling “dynamic replanning” capabilities. It allows workflows to be planned and validated upfront for optimal resource planning and transparency, while also responding to unpredictable events during execution to keep the schedule on track. Additionally, it includes advanced error detection and recovery capabilities.

Q: Could you tell us more about “dynamic replanning scheduling”?

A: Effective lab automation should alleviate the workload on personnel without adding new risks. By combining the best bits of static and dynamic schedulers, we’ve developed something that is easy to use and facilitates real walkaway time, allowing users to automate workflows confidently. This approach emphasizes reliability with dynamic error handling and offers reassurance through advanced planning features, reducing downtime and increasing both throughput and reliability.

To explain what our dynamic replanning scheduler does, we like to use a Google Maps analogy:

Google Maps will offer you the most efficient route from point A to point B, depending on the variables you select. It will also automatically reroute your journey and update your estimated arrival time if you take a wrong turn or your road is blocked, allowing you to continue without stopping to replan your route. This is the level of adaptability and efficiency that the LINQ dynamic replanning scheduling engine aims to achieve.

Giving scientists the freedom to plan their workflows and the reassurance that errors won’t halt their progress allows busy, highly skilled lab workers to really trust the automation and frees them to focus on other critical tasks.

Q: What motivated the development of this new kind of automation scheduler? 

A: Over the past 10 to 15 years, automation goals have largely focused on throughput and turnaround time metrics. Customers were looking for automation solutions that optimized one specific process to increase its speed or scale.

Now, we’re seeing biotech and pharma companies wanting more flexibility. They need to automate different workflows on the same system or automate increasingly complicated tasks, all while maintaining reliability and maximizing walkaway time.

With that kind of advanced automation now in focus, we’re developing our platform to improve metrics that align with R&D environments. This includes the speed at which workflows can be built and validated, the adaptability and expandability of the platform, successful workflow completion rates, and the real-time delivery of contextual data to complement experimental results.

We knew a more advanced scheduling engine was needed to deliver something that would meet the ambitious goals of highly innovative R&D labs, without compromising on the planning needs of others. We also understood that error handling and recovery, agnostic data integration, and advanced orchestration capabilities would be key to instilling confidence in the platform and its ability to manage workflows as well as a human could. So, we built all these features into our software development pipeline.

Q: Why is robust error handling such a crucial part of scheduling? 

A: In the early stages of lab automation, many solutions became obsolete because they would fail upon encountering errors; error handling wasn’t baked into the solution. Now, expectations are higher than ever. Failed workflows cost time and money and stifle innovation, so error handling and recovery are expected rather than preferred.

Error handling is a feature typically found in dynamic schedulers, so we’ve taken that as one of the “best bits” and built on it. Our scheduling engine considers time constraints, known conditionals, and data transfer events, mapping the state of each work cell to replan and find a valid new schedule. It alerts operators to issues remotely, records and transfers all event data in real time to the cloud-based interface for interrogation, and recovers workflows to a safe state. This ultimately prevents deadlocks and ensures that nothing is lost, even if the workflow can’t be rerouted to successful completion.
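As a rough sketch of what such an error-handling loop can look like (the work-cell objects with execute and move_to_safe_state methods, and the find_new_schedule and alert_operator callbacks, are hypothetical names for illustration; this is not the LINQ engine itself):

```python
import logging
from enum import Enum, auto

log = logging.getLogger("scheduler")

class CellState(Enum):
    DONE = auto()
    ERROR = auto()

def run_with_replanning(schedule, workcells, find_new_schedule, alert_operator):
    """Illustrative loop: execute tasks in order, log every event, and when a
    work cell reports an error, try to compute a valid new schedule around it.
    If no valid schedule exists, recover everything to a safe state rather
    than deadlocking."""
    while schedule:
        task = schedule.pop(0)
        cell = workcells[task.cell]
        state = cell.execute(task)              # assumed to return a CellState
        log.info("event: %s on %s -> %s", task.name, task.cell, state.name)

        if state is CellState.ERROR:
            alert_operator(task, cell)          # notify a remote operator
            new_plan = find_new_schedule(schedule, workcells)
            if new_plan is not None:
                schedule = new_plan             # reroute and keep going
            else:
                for c in workcells.values():
                    c.move_to_safe_state()      # park labware safely
                return False                    # run stopped, nothing stranded
    return True
```

The key point is the fallback order: log the event, alert the operator, attempt a valid new schedule, and only if that fails park everything in a safe state so the run stops cleanly instead of deadlocking.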

Q: Beyond scheduling, what other trends are emerging in lab automation, and how do you think they will influence future developments? 

A: Two key themes are taking center stage for our customers right now: data and AI. While these topics have always been part of our discussions, they’re now being addressed much earlier in the conversation.

Lab automation has facilitated large-scale generation of clean data by removing throughput restrictions, and global connectivity has given researchers access to reams of data from a variety of sources. However, not all of this data was collected with scientific purposes in mind, and even data generated in the lab can lack the metadata that would make it valid or interoperable.

Machine learning models need these large volumes of data for training, which is why quality, context, and volume are so important. Moreover, as AI and machine learning are integrated into more lab processes, they will continue to learn independently, provided we can supply them with clean and contextualized data.

The potential of AI in the lab is vast. When it comes to therapeutics, for example, AI-enhanced automation supports precision medicine, leading to more targeted and effective treatments delivered faster and to a larger portion of a varied population.

In R&D, in silico experimentation and closed-loop iteration cycles driven by automated, integrated data flows make innovation fast-paced and easier to validate.

High-quality data is generated almost as standard with lab automation; how this data will be delivered, standardized, contextualized, and accessed has now become the focus for us and our customers.

To get us to a place where smart laboratories, connected via the cloud, leverage AI to tackle more complex challenges, we must first solve the issues of interoperability and reliability in our automation. As automation platform developers, this is the future we have in mind and the trends shaping our product roadmap.

Find out more about the complete lab automation platform LINQ by Automata at www.automata.tech
