Lab Manager | Run Your Lab Like a Business

Ask the Expert: How to Automate Your Lab to Best Fit Your Needs

Marc Ferrer, Ph.D., Team Leader at the Chemical Genomics Center, which is part of the National Human Genome Research Institute at the National Institutes of Health, discusses the need for automation in various laboratory settings.


Marc Ferrer, Ph.D., Team Leader at the Chemical Genomics Center, which is part of the National Human Genome Research Institute at the National Institutes of Health, talks to Tanuja Koppal, Ph.D., contributing editor at Lab Manager Magazine, about the need for automation in various laboratory settings. Ferrer emphasizes the need for using “fit for purpose” automation and addresses the key questions that factor into the decision making. He also highlights the issues that impact the effective implementation and use of automation once the decision has been made.

Q: Why should lab managers consider bringing automation into their laboratories?

A: The first thing that people need to think about is what it is that they are trying to do, and what the ultimate goal is. That is really going to determine why and how much automation they really need. The advantages of automation are obvious in terms of achieving sample throughput and experimental reproducibility, and saving time and reagents, all of which translate to cost savings. Those are some of the direct benefits, but there are also some indirect benefits to automation. Once you start thinking about automation, you start thinking about next steps. You think about reagent stability, assay readouts, assay windows and miniaturization, and that leads to the development of more reliable and robust assays.

Q: How do you decide on how much you need to automate?

A: How much automation you need really depends on where you are, whether in industry or academia, in a small lab or a big lab, in an individual lab or a centralized core lab, and what it is you are trying to achieve. If you are in a centralized high-throughput screening (HTS) group in a big pharmaceutical lab, then you probably need an integrated robotic system, with good compound management coupled to a robust Laboratory Information Management System (LIMS). If all you are looking for is increased throughput and you don’t really have to deal with diverse assays and readouts, then you want to think about miniaturization. If you are in a small biotechnology lab where you work on a specific therapeutic area, or if you are in a core academic lab, you may still need an integrated robotic system, but you will also need more flexibility in terms of assay readouts and formats. If you are a principal investigator (PI) in an academic lab, bringing in automation can be expensive, and it would then make sense for a few PIs to get together and jointly buy liquid handling or reader capabilities. You don’t need the equipment to be integrated, because integration comes coupled with a LIMS and requires expertise and trained personnel to run the robots. So with a few nonintegrated pieces of equipment, you can get the throughput you need without a huge investment.

Q: How do you decide what you need in terms of automation?

A: First define your goals, then talk to the experts about the type of automation you need. Go to automation meetings and conferences, or go to your local HTS groups and explain what you are trying to do and get their advice. Ask lots of questions about what instrument works best for what assays, and then decide what would best serve your needs. We all have our own favorite vendors and equipment, so try to get opinions from several people. Get critical information related to versatility, robustness, technical support and training; this is going to be very important in getting the infrastructure up and running. You have to do your homework and find out what people are happy with, before talking to the vendors and making any kind of investment.

Q: How do you set budgets and priorities and allocate costs?

A: Define your scope, talk to the experts and find out what you are going to need. Once you have set a mandate, you will know what you need but you may not know what it is going to cost. Then go to the vendors and tell them what you are looking for, and start getting quotes on the equipment needed. You can then go back to the source that is funding your investment and tell them what it’s going to cost. It’s an iterative process, with sets of reality checks on what is doable and what is not. If you are in a small lab, work with your core group to see if they can demonstrate what works for you and what doesn’t. On the other hand, if you are in a core group, identify your customers and find out what they need and how you will be working with them.

Q: In your opinion, what are some of the biggest mistakes that people make when it comes to automation?

A: Some people try to do it themselves and build everything from scratch by putting together bits and pieces of equipment with the help of someone who is technically savvy. The problem there is you then spend more time building and maintaining the system than actually using it. So when you identify what you need, leverage the vendors to get you going rapidly. The do-it-yourself approach, without the right support and expertise, has led to some of the biggest and most costly mistakes made. Don’t reinvent the wheel; instead, choose a vendor who can provide prompt, affordable service to minimize any downtime. Establishing good relationships with the vendors is critical for the day-to-day running of these instruments, which are quite sophisticated and not always easy to fix. Sometimes, people also forget about the data management involved. There are different sets of tools for data processing, data mining and data visualization, and you cannot build the hardware infrastructure without thinking about how you are going to track and analyze the data. Finally, think about automation beyond your current application and build an infrastructure that can be easily modified for other applications.

Q: How do you plan for downtime for routine maintenance and equipment breakdown?

A: With time and experience, you can estimate how much buffer time you need to build in for preventive maintenance and downtime. In a centralized core lab, you can get historical data to help you plan things out and have good backup systems in place. Having an infrastructure to minimize downtime is critical.

Q: What is the current trend in lab automation?

A: I think the trend these days is toward “fit for purpose” automation. People are not buying a fully automated robotic system just because they can afford to do so. People are thinking early and more clearly about what they need and are building the infrastructure around it. The trend is to buy smaller integrated systems that are more flexible and will allow you to do more biology and more assay readouts. The large, centralized HTS groups may still rely on big integrated systems, but for most other screening groups the flexibility to adapt to different assays is becoming important.

Marc Ferrer received his B.Sc. in Organic Chemistry from the University of Barcelona, Spain, in 1989. In 1994, he earned his Ph.D. in Biological Chemistry from the University of Minnesota, Minneapolis, working on new methodologies for peptide synthesis and studying the molecular basis of protein folding. He then moved to the Department of Molecular and Cell Biology at Harvard University, Cambridge, as a postdoctoral fellow, where he used combinatorial chemistry and phage display methodologies to identify peptides and peptidomimetics that blocked HIV infection. In 1999, he joined the Department of Automated Biotechnology, the central HTS group at Merck Research Laboratories, developing assays and implementing high-throughput screens for lead and target identification. Over the last ten years, he has developed extensive expertise in assay development and miniaturization to implement small molecule high-throughput screening in 1,536- and 3,456-well formats for lead identification. He also implemented siRNA HTS for target identification, developing new automation-friendly siRNA transfection protocols, improved data analysis tools for hit selection, and strategies for better on-target hit validation. He also used large-scale chemical genomics approaches for more efficient target and lead identification and pathway mapping in physiologically relevant cellular systems, combining multiplexed assay readouts with compound/RNAi libraries. Recently, he joined the NIH Chemical Genomics Center, where he is a Team Leader in the Biomolecular Profiling and Screening group, and where he continues to develop and implement chemical genomics approaches using small molecule and siRNA screens to identify tools to functionally probe biological systems and develop new therapeutics for rare and neglected diseases.