As the director of global strategic alliances at Elemental Machines, Dan Petkanas has years of experience helping life sciences companies future-proof their lab operations and secure their quality functions. He has helped countless companies digitize their laboratory or manufacturing operations, securing data integrity and advancing their reporting and analytics capabilities. Here, Petkanas offers his insights into making your lab more intelligent to streamline operations, improve decisions, and yield better results.
Q: In your opinion, what makes a lab “smart”?
A: I like to look at this from a few different angles. When you look at a lab, there are many different stakeholders involved (scientists, automation teams, quality control, and lab operations, just to name a few) who all need to work cohesively. So a lab needs to be smart not just in its design, but also in how equipment is used and how personnel move through the space. This approach focuses on optimizing the lab as a whole so that scientists can perform their work effectively.
For this to happen, lab operations teams need access to quality data to drive spatial planning and operational decisions. Whether it's scheduling and calendar data, equipment utilization, or room occupancy, all of this information feeds into a strong operational foundation.
Smart labs, at their core, allow scientists to do high-quality, repeatable work with minimal interruptions and little to no manual data entry. To achieve this, labs need well-connected systems in place from the beginning. Everything from the equipment they use to their LIMS and ELN systems, along with their scheduling and business intelligence tools, must communicate and work together, pulling and sharing data.
Q: For lab managers looking to build a smart lab, where should they begin, and which elements should they prioritize?
A: I think, for starters, lab managers need to look for platforms that can grow and scale as the needs of their lab change. In the beginning, it's pretty easy to brute-force smaller systems into working together, whether that's a data collection system pulling information from equipment like mass specs or HPLCs and sending it to a LIMS or a data lake, or a stand-alone equipment monitoring system. However, labs need platforms that are robust, scalable, and interconnected with the multitude of solutions their stakeholders rely on. For example, the information that comes off a piece of equipment will be used for vastly different purposes by quality teams and by scientists, so choosing a platform that supports both groups is key when you're looking to build a smart lab.
Having these growth-ready systems in place also allows lab managers to focus on the more important work and questions coming out of the lab, rather than dealing with issues like, "Why isn't this equipment connected?" or "Why isn't the data getting sent?" It also spares them inconveniences like getting up from their desk and walking into the lab to troubleshoot equipment.
Q: What are the common challenges labs face when implementing smart technology, and how can they overcome them?
A: This ties back to my previous point about looking for platforms, not just point solutions. Obviously, there’s a multitude of technology out there for labs. Some of these systems require hands-on setup, others are more intrusive to get going, and some silo their data more than others. Certain systems are also better suited to specific types of labs, whether they’re chemistry, biology, QC, or R&D labs. The needs change based on the type of science the lab is doing.
We see labs starting out in all shapes and sizes. Not everyone sets up a brand-new lab with brand-new equipment. A lot of times, it's a move or an expansion, meaning you have a mix of old and new equipment, or your methodology may require a certain manufacturer for a specific part of the protocol or process. These labs can quickly become disjointed and disconnected. The most common issue we see with customers who are unhappy with their current technology is that they've simply outgrown the initial systems they put in place. Those systems weren't designed to fail; they were put in place to solve a problem the lab had years ago that may no longer be relevant today.
Using a platform in the middle that can connect old equipment, new equipment, and equipment from different manufacturers enables lab managers, scientists, and quality teams to access the information they need quickly, easily, and in a scalable way.
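As a rough, hypothetical sketch of what such a middle layer does (this is not Elemental Machines' product or API), the core pattern is one adapter per data source that translates vendor-specific output into a shared record format; every field name and payload shape below is invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical common record that downstream tools (LIMS, dashboards,
# quality reports) could consume regardless of the instrument vendor.
@dataclass
class EquipmentReading:
    instrument_id: str
    parameter: str        # e.g., "temperature"
    value: float
    unit: str
    recorded_at: datetime

def from_legacy_csv_row(row: dict) -> EquipmentReading:
    """Adapt a row exported by an older instrument's CSV logger."""
    return EquipmentReading(
        instrument_id=row["serial"],
        parameter="temperature",
        value=float(row["temp_c"]),
        unit="C",
        recorded_at=datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M:%S"),
    )

def from_modern_json(payload: dict) -> EquipmentReading:
    """Adapt a JSON message from a newer, network-connected instrument."""
    return EquipmentReading(
        instrument_id=payload["deviceId"],
        parameter=payload["metric"],
        value=payload["reading"]["value"],
        unit=payload["reading"]["unit"],
        recorded_at=datetime.fromisoformat(payload["observedAt"]),
    )

# Both sources end up in the same shape, so one pipeline can serve
# quality teams and scientists alike.
readings = [
    from_legacy_csv_row({"serial": "FRZ-007", "temp_c": "-79.5",
                         "timestamp": "2024-05-01 08:00:00"}),
    from_modern_json({"deviceId": "HPLC-12", "metric": "pressure",
                      "reading": {"value": 182.4, "unit": "bar"},
                      "observedAt": "2024-05-01T08:00:00"}),
]
for r in readings:
    print(r)
```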
Q: How does predictive analytics shape the development of a smart lab?
A: Overall, predictive analytics allows labs to be much more proactive in their operations, rather than reactive and constantly fighting fires. The downstream effect we see from this is that the science gets finished more efficiently and at a higher quality.
We mostly see this in two areas. The first, and the more visible of the two, is our AI-predicted freezer health scores. By identifying abnormal freezer behavior, we can alert lab managers weeks before a catastrophic event is likely to occur. This gives them time to investigate, move samples to a different freezer, or service the equipment before any damage is done. Expanding this to all lab assets gives lab managers even more predictive capability. If we can tell you that equipment A, for example, is running differently than it did three weeks ago, you can potentially prevent a major failure before it happens.
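To make the general idea concrete, here is a deliberately simplified sketch of flagging abnormal behavior against a rolling baseline; it is not the health-score model described above, and the window size, threshold, and hourly-data assumption are all illustrative.

```python
from statistics import mean, stdev

def flag_abnormal_readings(temps_c, window=48, z_threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline.

    temps_c: chronological freezer temperature readings (degrees C).
    window: number of prior readings used as the baseline (assumes roughly hourly data).
    z_threshold: how many standard deviations counts as "abnormal" (illustrative).
    """
    alerts = []
    for i in range(window, len(temps_c)):
        baseline = temps_c[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(temps_c[i] - mu) / sigma > z_threshold:
            alerts.append((i, temps_c[i]))
    return alerts

# Example: a -80 C freezer that starts drifting warmer near the end.
readings = [-80.0 + 0.2 * (i % 3) for i in range(200)] + [-74.0, -72.5, -70.0]
for index, value in flag_abnormal_readings(readings):
    print(f"Reading {index}: {value} C looks abnormal vs. recent baseline")
```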
The second area where we see predictive analytics making a big impact is experimental workflows. If the lab is set up to automatically capture and connect all sorts of experimental data—whether it’s equipment performance, lab conditions, or storage conditions—you can really start to see trends much faster. If you’re able to predict in the middle of an experiment that it’s not going to be a worthwhile effort because, for example, the storage conditions of the sample were wrong or the equipment wasn’t properly calibrated, you can actually stop that experiment early, saving both time and resources.
Q: What unique opportunities do recent AI and data science advancements offer labs?
A: I think a lot of people immediately think of applications like drug discovery, biomarker targets, analyzing large datasets, or other things of that nature. Our focus at Elemental Machines is on the operational side: how labs and their day-to-day operations can use AI.
We’ve seen that AI is really good at identifying patterns (much, much better than we are). These patterns can be things like how and when the lab is used. For example, is it busy on Tuesdays or on Monday afternoons? Do scientists have the materials they need to do their work? Do they experience more issues with an experiment when it’s run at a certain time of day, or when using a particular piece of equipment?
A lot of this information lives anecdotally with the scientists or the lab manager, but if it's captured digitally and automatically, we can use AI to analyze it, surface these patterns, and help drive the science forward. More and more AI and data science tools are also being applied to patterns in equipment performance, and AI is far better than we are at finding those patterns and relaying them to lab managers so they can optimize their lab operations.
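As a hypothetical sketch of what pattern-finding over automatically captured usage data might look like at its simplest (the event log, field names, and thresholds here are invented), even basic counting begins to answer the kinds of questions above.

```python
from collections import Counter
from datetime import datetime

# Hypothetical, automatically collected event log: when equipment was used
# and whether the run had an issue. In practice this would come from the
# connected platform rather than a hand-maintained list.
events = [
    {"instrument": "HPLC-12", "start": "2024-05-06T09:15", "issue": False},
    {"instrument": "HPLC-12", "start": "2024-05-07T16:40", "issue": True},
    {"instrument": "MS-03",   "start": "2024-05-07T10:05", "issue": False},
    {"instrument": "HPLC-12", "start": "2024-05-14T17:10", "issue": True},
    {"instrument": "MS-03",   "start": "2024-05-14T11:30", "issue": False},
]

# Which weekdays are busiest?
by_weekday = Counter(
    datetime.fromisoformat(e["start"]).strftime("%A") for e in events
)

# Do issues cluster on a particular instrument or late in the day?
issues_per_instrument = Counter(e["instrument"] for e in events if e["issue"])
late_day_issues = sum(
    1 for e in events
    if e["issue"] and datetime.fromisoformat(e["start"]).hour >= 16
)

print("Runs per weekday:", dict(by_weekday))
print("Issues per instrument:", dict(issues_per_instrument))
print("Issues starting after 4 pm:", late_day_issues)
```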
To learn more, visit elementalmachines.com