
Ask the Expert

Drs. Swerdlow and Simpson on Trends in Lab Automation

While there are many advantages to automating certain workflows in the lab, a lot of thought must be put into when and where automation can be best used

by Tanuja Koppal, PhD

Contributing editor Tanuja Koppal, PhD, talks to Harold Swerdlow, PhD, vice president of sequencing at the New York Genome Center, about the use of automation in sequencing, while Kaylene Simpson, PhD, associate professor and head of the Victorian Centre for Functional Genomics (VCFG) at the Peter MacCallum Cancer Centre in Australia, discusses the use of lab automation in her work involving cell culture and high-throughput functional screening. Both agree that while there are many advantages to automating certain workflows in the lab, a lot of thought must be put into when and where automation can be best used.


Q: What are some of the challenges in next-generation sequencing?

A: Some of the challenges in sequencing are associated with extraction of DNA and preparation of libraries for sequencing, or with handling the data that comes after the sequencing. The library preparation is the part that is manual, takes a lot of time, and is prone to mistakes. In terms of automation, we have identified the steps that are the trickiest for a technician to do. We have exclusively used liquid-handling robots to replicate what a bench scientist would do, namely transferring liquids with pipettes, but in a way that is more accurate and easier to scale. We have always focused on scale, using 96-well microtiter plates rather than individual tubes. We focus on scalable automated robotic systems that integrate well with our laboratory information management systems. This helps us keep track of samples, reagents, instruments, and assays in a way that’s faster and more accurate than people can. There are upfront costs with automation, but the investment pays back very quickly.
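To make the LIMS point concrete: the benefit of integrating the robots with a laboratory information management system is that every plate, well, and sample is machine-readable and traceable. The minimal sketch below (Python, with hypothetical names throughout; it is not the Genome Center's actual system) shows the kind of barcoded plate-and-sample record such tracking relies on.

```python
# A minimal sketch (not the NYGC's actual LIMS) of how barcoded plates and
# samples might be tracked so that a liquid-handling run can be audited.
from dataclasses import dataclass, field

@dataclass
class Sample:
    sample_id: str          # barcode on the source tube
    project: str

@dataclass
class Plate:
    barcode: str                                 # plate-level barcode scanned by the robot
    wells: dict = field(default_factory=dict)    # well name ("A1"..."H12") -> Sample

    def load(self, well: str, sample: Sample) -> None:
        if well in self.wells:
            raise ValueError(f"{well} on plate {self.barcode} is already occupied")
        self.wells[well] = sample

# Example: register a library-prep plate so every later transfer is traceable.
plate = Plate(barcode="LP000123")
plate.load("A1", Sample(sample_id="S-0001", project="WGS-2024"))
plate.load("A2", Sample(sample_id="S-0002", project="WGS-2024"))
print(plate.wells["A1"].sample_id)   # -> "S-0001"
```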

Q: How much of your work is currently automated?

A: We still do some projects manually when the number of samples processed is small, or when the technique varies slightly from one run to the next. Automation depends on the number of samples, the workflow, how many steps there are in the protocol, and whether those steps vary. For instance, with RNA sequencing, the protocol varies quite a bit depending on how many samples there are and how they are processed, which makes it a little trickier to automate. We have automated some of the capture steps and are working toward automating more. On the other hand, with whole-genome sequencing, there are only two protocols that we use, and there are kits provided by manufacturers. We process about 20,000 samples per year and use a straightforward protocol; therefore, whole-genome sequencing is fully automated using robotic systems. On a sample basis, nearly 90 percent of our work is fully automated, since a vast majority of our samples are for whole-genome sequencing projects. However, on a project basis, only 60 percent to 70 percent of our work is automated, as there are a lot of smaller projects that are not automated at this time.
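As a rough illustration of the trade-off Dr. Swerdlow describes, the decision hinges on sample volume, protocol length, and how much the protocol varies from run to run. The heuristic below is purely illustrative; the thresholds are hypothetical, not values used at the Genome Center.

```python
# Illustrative only: a back-of-the-envelope way to weigh the factors mentioned
# above (sample volume, protocol length, and run-to-run variability).
def worth_automating(samples_per_year: int, protocol_steps: int,
                     protocol_variants: int) -> bool:
    """Return True when a workflow looks like a good automation candidate."""
    high_volume = samples_per_year >= 5_000        # hypothetical threshold
    stable_protocol = protocol_variants <= 2       # few kit/protocol variations
    enough_steps = protocol_steps >= 10            # enough manual work to save
    return high_volume and stable_protocol and enough_steps

print(worth_automating(20_000, 25, 2))   # WGS-like workflow -> True
print(worth_automating(200, 30, 6))      # small, variable project -> False
```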

Q: How do you decide when and what needs to be automated?

A: It’s important to start with automating processes that are the hardest to do or ones that are mission critical, and then work from there. People might be tempted to start automating steps that are easiest to do, but it’s best if the automation investment goes toward projects or protocols that have the biggest impact. It’s also important not to automate things that humans are typically good at. For instance, in some labs you see a lot of sophisticated equipment with long articulated robotic arms for picking tubes and plates or for moving stacks in and out of the refrigerator or freezer. People can do those tasks cheaply and efficiently, and accuracy can always be checked if the vials and plates are barcoded. I personally do not think it’s worth spending enormous amounts of money on robotics for freezer management and such, unless you are working with a million samples a year. People think that they have to automate every single part of a protocol, but they should really be thinking about automating only those parts that are very routine and prone to errors. None of our protocols are 100 percent automated. We have dedicated modules to perform different activities like DNA prep, PCR [polymerase chain reaction], quality control, and such, but we move plates in and out of these modules ourselves.
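The barcode check that makes this hybrid approach safe is simple to picture: when a person carries a plate to the next module, the module scans the barcode and compares it with what the run schedule expects before proceeding. A minimal sketch, with hypothetical module and plate names, follows.

```python
# A minimal sketch of the kind of barcode check that lets people move plates
# between modules safely: scan the plate, compare against what the schedule
# expects, and refuse to proceed on a mismatch. Names are hypothetical.
EXPECTED_NEXT_PLATE = {
    "dna_prep": "LP000123",
    "pcr":      "LP000124",
    "qc":       "LP000125",
}

def verify_plate(module: str, scanned_barcode: str) -> None:
    expected = EXPECTED_NEXT_PLATE.get(module)
    if expected is None:
        raise KeyError(f"Unknown module: {module}")
    if scanned_barcode != expected:
        raise RuntimeError(
            f"{module}: expected plate {expected}, got {scanned_barcode}"
        )
    print(f"{module}: plate {scanned_barcode} confirmed, OK to load")

verify_plate("pcr", "LP000124")   # prints a confirmation line
```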

Harold Swerdlow, PhD, serves as vice president of sequencing at the New York Genome Center. He is responsible for developing cutting-edge technologies and managing the clinical and research production facilities. Prior to joining NYGC, he served as head of research and development at the Wellcome Trust Sanger Institute. Dr. Swerdlow was the chief technology officer at Dolomite Ltd. and the senior director of research at Solexa Ltd. He also served as a unit coordinator and director of the Microarray Core Facility at the Center for Genomics Research at the Karolinska Institute in Stockholm, Sweden.


Q: Can you give us an idea of the types of projects that are pursued at your center?

A: The Victorian Centre for Functional Genomics at Peter MacCallum Cancer Centre was established in 2008 to enable genome-wide RNAi [RNA interference] screening for researchers across Australia. As a core technology platform, we see a great diversity of projects, with studies of cancer signaling, tumor cell motility, drug sensitivity and resistance mechanisms, and synthetic lethality gaining high priority. We also have a large focus on human health in the area of host-pathogen interactions, performing RNAi screens to identify vaccine candidates against Hendra virus, Leishmania, Coxiella, Legionella, and many other pathogens. When the lab was established, the majority of screens were cancer-focused synthetic lethal studies measuring cell viability via a luminescence reagent using automated multimode plate readers. These days, automated quantitative phenotypic high-content imaging is the mainstay of the lab, with an extensive array of different assays developed, including cell morphology and motility, quantitation of membrane expression and localization, identification of host cell invasion by different pathogens, quantitation of the extent of DNA damage foci formation, cell proliferation, and senescence.

Q: Which of these projects or what aspects of these projects require automation?

A: All projects in our laboratory utilize liquid-handling automation, both small personal workstations and plate washers, and large mainframe robotics. Our RNAi and compound library plates are all managed using these automation platforms, including hydration of lyophilized reagents, aliquoting into daughter copies, cherry-picking single targets, reformatting of plates from 96- to 384-well format, and, of course, the actual RNAi transfections and delivery of compounds to assay plates. The smaller liquid-handling workstations are essential for dispensing cells, lipid/Opti-MEM complexes, drug dosing, media changes, and fixing and staining steps. Basically, anything you might do in the lab, we now do in high throughput. All our image analysis is also fully automated through the instrument-associated software, and we have a bioinformatics pipeline that is customized on a per-user basis for determination of hit molecules. We also house a reverse-phase protein array [RPPA] platform that utilizes a nanoplotter instrument to automate the dispensing of picoliter volumes of protein lysate to quantitate native and phospho-protein expression. Imaging the RPPA chips is fully automated, and software-based analysis is included with the chip reader.
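For readers unfamiliar with plate reformatting, going from 96- to 384-well format is a deterministic mapping of each source well into one quadrant of the destination plate. The sketch below shows one common interleaved mapping; conventions differ between labs and liquid handlers, so treat it as an illustration rather than the VCFG's actual worklist.

```python
# One common way to map four 96-well source plates into a single 384-well plate
# (interleaved quadrants). Conventions vary, so this is illustrative only.
import string

ROWS_96 = string.ascii_uppercase[:8]     # A-H
ROWS_384 = string.ascii_uppercase[:16]   # A-P

def map_96_to_384(quadrant: int, well: str) -> str:
    """quadrant: 0-3 (which 96-well source plate); well: e.g. 'A1'."""
    row = ROWS_96.index(well[0])          # 0-7
    col = int(well[1:]) - 1               # 0-11
    dest_row = 2 * row + quadrant // 2
    dest_col = 2 * col + quadrant % 2
    return f"{ROWS_384[dest_row]}{dest_col + 1}"

print(map_96_to_384(0, "A1"))   # -> A1
print(map_96_to_384(1, "A1"))   # -> A2
print(map_96_to_384(2, "A1"))   # -> B1
print(map_96_to_384(3, "H12"))  # -> P24
```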

Q: What do you see as the pros and cons of using automation?

A: I love everything about automation, except when it breaks down. Although that happens rarely, it’s always a major interruption, and generally you did nothing different from yesterday! Over time, we have built redundancy into our platform to ensure we have the capacity to service the large number of projects and to provide emergency backup. Having redundancy is critical but can often be an expensive investment that may not be well utilized. We have three personal liquid-handling workstations, but only one mainframe robot. As one preventive measure, we purchased a second tip-holding mandrel that does all the specific tip liquid handling, and that has saved us several times already. Our platform is very modular to enable many different users to be in the lab at the same time. Automation vastly accelerates everything we do and significantly improves experimental accuracy and reproducibility. Automation is expensive to set up and great when you get it, but over time it becomes outmoded. However, the financial resources to buy new systems are hard to come by once the laboratory is established. The fully integrated systems have many advantages and enable more of a “set it and forget it” mentality, and, of course, reduce the opportunity for operator error during the course of the screen run. However, in a core platform like ours that supports a large number of researchers running their own experiments, a fully integrated system would greatly decrease our capacity to offer the level of access we currently have.

Q: Are you noticing any new trends in lab automation and cell culture?

A: In the high-content imaging arena, we are moving rapidly toward confocal-based imaging. This is particularly important as more researchers are using 3-D cellular models that more accurately represent the in vivo cellular state but are much harder to image and quantify. Instruments that can quantitatively image cells in suspension will greatly expand the ability of researchers working with suspension cell cultures, liquid biopsies, and other non-adherent cell types to perform discovery-based screens. We would love to see an affordable system that would do the mundane, routine cell passaging. Performing large-scale screens is a lengthy process that requires great attention to detail. Passaging cells and preparing them for screens in a highly reproducible manner, which is essential for low-variability data, is quite tedious. New cellular scaffolding systems and different types of plates for 3-D screening are also being developed, and I believe this area is going to continue to grow rapidly. Methods to improve the throughput of such systems in basement membrane substrates will be a game-changer for the cancer signaling field in particular.

Kaylene J. Simpson, PhD, is an associate professor and head of the Victorian Centre for Functional Genomics and the ACRF Translational RPPA Platform at the Peter MacCallum Cancer Centre in Australia. The VCFG enables researchers across Australia to perform genome-wide and boutique-scale small interfering RNA, short hairpin RNA, microRNA, and long noncoding RNA screens, as well as small-compound screens, with an emphasis on quantitative phenotypic high-content imaging.