
Efficiencies of Lab Automation

Louis Scampavia, PhD, associate professor in the Department of Molecular Therapeutics at the Scripps Research Institute in Florida, talks to contributing editor Tanuja Koppal, PhD, about how automation has been a critical part of his facility's high-throughput screening activities. He goes into the details of what can and should be automated and the due diligence that needs to be performed before making those decisions, which have a long-lasting impact on the workings of a lab.

by Tanuja Koppal, PhD

Q: What types of high-throughput screens do you run?

A: We are primarily focused on high-throughput screening (HTS) and ultrahigh-throughput screening (uHTS), and we use either 384- or 1536-well microtiter plate formats for those screens. A little more than half of them are cell-based, and the remaining are traditional biochemical interaction screens. These screens span a number of assay types, from enzymatic and reporter gene assays to second messenger and protein-protein interaction screens, and more. We have the ability to screen up to a million compounds in a 24-hour period, although we rarely do that.

Q: What aspects of the screening are automated?

A: We are only a dozen people in our facility, so everything that we can automate, we have. That includes compound management, where we use robotics for formatting and reformatting plates. We have robots for cherry picking and freezer storage that carefully bring out vials with 2-D barcodes so we can reformulate them into our compound collection. For screening, the plates are all loaded by robots, and robots are also used for compound dosing. The robots deliver the plates to the plate reader for obtaining the readouts. Most important, we have integrated informatics with the robotics as much as possible so that information is obtained in real time. There is some manual intervention involved in preparing the chemicals and transporting plates from the compound management room to the screening facility, but we have tried to make things as automated as possible.
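As a rough, hypothetical illustration of what pairing informatics with each robotic step can look like, the sketch below tracks a plate's 2-D barcode through a few invented pipeline steps and surfaces failures to the operator as they happen. The class and step names are assumptions made for illustration and do not represent the facility's actual software.

```python
# Minimal sketch of real-time plate tracking across an automated screening run.
# PlateEvent, PlateTracker, and the step names are hypothetical; the point is
# simply to pair every robotics step with an informatics record the moment it
# completes, so errors are caught before they propagate down the pipeline.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class PlateEvent:
    barcode: str          # 2-D barcode read from the plate
    step: str             # e.g. "compound dose", "plate read"
    timestamp: datetime
    ok: bool
    note: str = ""

@dataclass
class PlateTracker:
    events: List[PlateEvent] = field(default_factory=list)

    def log(self, barcode: str, step: str, ok: bool = True, note: str = "") -> None:
        """Record a robotics step as soon as it completes so operators see status live."""
        event = PlateEvent(barcode, step, datetime.now(), ok, note)
        self.events.append(event)
        status = "OK" if ok else "FAIL"
        print(f"[{event.timestamp:%H:%M:%S}] {status} {barcode}: {step} {note}".rstrip())

    def failures(self) -> List[PlateEvent]:
        """Plates that need manual intervention before screening continues."""
        return [e for e in self.events if not e.ok]

# Example run: three steps for one hypothetical 1536-well plate
tracker = PlateTracker()
tracker.log("PLT-000123", "loaded onto screening deck")
tracker.log("PLT-000123", "compound dose", ok=False, note="tip clog on dispenser 2")
tracker.log("PLT-000123", "plate read")
print("Flagged for review:", [e.barcode for e in tracker.failures()])
```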

Q: Are biochemical and cell-based assays both equally amenable to automation?

A: Historically, biochemical assays have been more amenable to automation, but today both types of assays can be automated. There are certain limitations with cell-based screens that have to do with the cell line and the type of assay being performed. For instance, some cells have to be kept in certain environmental conditions that cannot be maintained during automated dispensing and incubation, or some types of cells can get washed away during the readout. But with the use of proper cell lines, both biochemical and cell-based screens can be done well and efficiently when automated. In any case, we never take anything to HTS until it has been well developed with smaller-scale pilot screens done off-line. That’s when the issues with automation are worked out and the go/no-go decisions are made.

Q: When automating, should everything be automated from the get-go, or should things be brought in and integrated in stages?

A: The first generation of HTS was a piecemeal integration of instruments. It demonstrated a proof of principle but was not very reliable. We are fortunate to be working with what is the second generation in lab automation, where things are designed from the ground up and everything is reliable and robust. Today, if I were to give advice, I would ask people to worry about integrating the informatics. Automation is one facet, but having proper integration of information along the screening pipeline, for quality control (QC) prior to use and for instrument auditing, can help prevent the propagation of errors. Whenever informatics can be integrated to provide real-time information to the operator, the data obtained is more reliable.

Q: What questions need to be asked before investing in lab automation?

A: What people should look for are reliability and robustness, and those factors should drive your decisions. A lot of times, robotics is sold on the concept of versatility, but reliability then becomes the Achilles’ heel. Then you run into cost overruns, having to repeat experiments and analyses. Another way to minimize cost overruns is to minimize human or manual interactions that can lead to errors that are propagated down the pipeline. So in your design, you must have instruments and protocols undergo QC so as not to amplify mistakes during HTS. Automation can fail at times. For instance, you can have dispensing failures, and having the compound plates audited, either visually or using imaging software such as the Plate Auditor, can help pick out those mistakes prior to screening. During the screen you need proper controls and control plates in place, and all the information should be available in real time in order to catch mistakes early and control costs.
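The interview does not name a specific QC statistic, but the Z'-factor (Zhang et al., 1999) is one widely used way that control wells can flag a problem plate in real time. The sketch below, with invented readout values and the conventional pass threshold of 0.5, shows how such a check might be scripted.

```python
# Sketch of a common plate-level QC statistic, the Z'-factor of Zhang et al. (1999):
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|. Values above ~0.5 are
# typically considered acceptable for HTS. Control readouts here are hypothetical.
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Return the Z'-factor for one plate's positive and negative control wells."""
    separation = abs(mean(pos_controls) - mean(neg_controls))
    return 1.0 - 3.0 * (stdev(pos_controls) + stdev(neg_controls)) / separation

# Hypothetical control-well readouts from a single plate
pos = [980, 1010, 995, 1005, 990, 1002]   # e.g. uninhibited signal
neg = [102, 98, 110, 95, 101, 99]         # e.g. fully inhibited signal

score = z_prime(pos, neg)
print(f"Z' = {score:.2f} ->", "pass" if score > 0.5 else "flag plate for review")
```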

Q: Did you have to make changes to your lab design or to the workflows to accommodate the lab automation?

A: We had to make changes to both. We have a modular lab design, where even our lab benches are on wheels to accommodate new automation. Our facility is never stagnant, and we are constantly upgrading our capabilities. One recent upgrade is an imaging reader for performing high-content screening (HCS) that can be integrated with our robotics platform. We have to have modularity to upgrade our instrument hardware as the technology improves. We also spend a good deal of time improving our flow of informatics. We recently upgraded our software so the LC/MS QC now automatically updates into the database to provide seamless integration with the results. Now we not only know what efficacy a compound has, but we can also understand how that relates to its purity or any other property identified. Versatility is critical for constant upgrades, and this starts from the time you are designing your facility, knowing how things can change over time.
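As a loose illustration of that kind of integration, the sketch below joins hypothetical LC/MS purity records with screening results in an in-memory SQLite database so that each hit can be read alongside compound quality. The schema, compound IDs, and the 90 percent purity cutoff are assumptions for illustration, not the facility's actual system.

```python
# Minimal sketch of joining LC/MS QC (purity) data with screening results so a
# compound's efficacy can be interpreted alongside its quality. All table and
# column names, values, and thresholds are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE lcms_qc (compound_id TEXT PRIMARY KEY, purity_pct REAL);
    CREATE TABLE screen_results (compound_id TEXT, pct_inhibition REAL);
    INSERT INTO lcms_qc VALUES ('CMPD-001', 98.5), ('CMPD-002', 62.0);
    INSERT INTO screen_results VALUES ('CMPD-001', 87.0), ('CMPD-002', 91.0);
""")

# Hits with low purity deserve a closer look before any follow-up work
rows = conn.execute("""
    SELECT r.compound_id, r.pct_inhibition, q.purity_pct
    FROM screen_results AS r JOIN lcms_qc AS q USING (compound_id)
    WHERE r.pct_inhibition > 80
""").fetchall()

for compound_id, inhibition, purity in rows:
    flag = "" if purity >= 90 else "  <- check purity before follow-up"
    print(f"{compound_id}: {inhibition:.0f}% inhibition, {purity:.0f}% pure{flag}")
```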

Q: How should you go about evaluating the right vendors?

A: You should never be too quick to decide on anything. You should develop a relationship with the vendor and see what its instrument can offer and whether it can meet your needs. However, a big part of the cost depends on how well the instrument integrates into the rest of your infrastructure and informatics so that automation goes seamlessly. It is also important to know how well the instrument will be supported in the future, even after it becomes obsolete. It helps to network with people at other facilities to find out what they have found in terms of reliability and use. Belonging to professional societies is another unbiased way to find out where instruments have worked and where they have failed.

Q: What can people expect to see in next-generation lab automation?

A: There is a third generation of lab automation already out there that is focused on versatility and extended capabilities. It’s not necessarily better in performance, but it offers a broader range of use. Previous years had focused on a steady increase in screening capacity, library size, and miniaturization, but going forward the focus will be more on content and quality. Libraries being screened are now getting smaller, and people are more interested in library diversity and pharmacological potency, incorporating natural product-like compounds. On the automation side, that calls for instruments that can provide greater physiological relevance, such as kinetic assays. There will be more integration of HCS using image analysis and not simply reporters. There may be greater integration of fragment-based screening with HTS. As in silico screening comes into play, it’s going to force HTS to look at more complicated problems based on physiological relevance.