George Weinstock, Ph.D., professor of Genetics and Molecular Microbiology at Washington University, talks to contributing editor Tanuja Koppal, Ph.D., about what a lab manager can do to stay abreast of changes in the rapidly evolving field of next-generation sequencing. He shares his knowledge and years of experience evaluating new instruments to guide users on what they should look out for as they strive to meet budgets while increasing productivity.

Q: What is your advice to lab managers working in a field such as next-generation sequencing, where everything is so dynamic and always in a state of flux?

A: This has to be one of the most difficult fields to be a lab manager in. It is so fast moving, which means that your options are changing all the time in terms of instruments, reagents, and accessories for everything from sample prep up front to downstream data handling. So for the person who has to make decisions in the lab, you’re faced with either making a large capital outlay or training staff to focus on a particular application. And given that in one to two years there will be new applications that will make what you’ve just invested in a bit of a dinosaur, you will constantly be fighting these kinds of issues and decisions.

Some of the more recent next-generation sequencing instruments now have a much smaller price tag than in the past. I think that direction is likely to continue as long as there’s competition. But if you calculate how much it costs to do sequencing, in dollars per base pair, sometimes lower-cost instruments are more expensive to run. The equipment outlay is lower, but the daily operating cost may be higher. So there are a lot of tradeoffs, and there’s no simple answer or formula to deal with that. One of the most important things to avoid is being too influenced by the marketing. Things get “hot,” and everybody seems to want to go after those technologies. The traditional experience has been that instruments do eventually reach the goals that are described, but it usually takes considerably longer than you’re led to believe.

Q: How long does it take for a technology to mature?

A: I would say that typically, for almost every new instrument we get, it takes about 12 months for it to really become stable and do operationally what it was originally described to do. During that 12-month period, one is dealing not only with tweaking the engineering of the instrument itself, such as taking the hinges off one side of a door and putting them on the other side because they were creating a lot of problems; more often, it’s subtle things to improve the stability and performance of the machine, such as tweaking a laser or putting in a different kind of pump. And during this period, the protocols are usually changing all the time too. It’s almost never the case that the protocol you end up following 12 months after you get an instrument is the same protocol that came with it originally. That means the assay kits change, and you can be in a position of having bought reagents that are no longer useful. There are a lot of ups and downs during that first year, and so one of the main messages in a field that moves very, very quickly is to keep your eyes and ears open and keep track of everything that’s going on. You can be assured that the first time you hear about something, it’s about a year away from when you really have to start thinking about making your decision to buy.

Q: How do you recommend making decisions in terms of buying instruments and reagents?

A: The tough decision is when you have instruments that seemingly perform similar tasks. How do you choose which one to get? Assuming that the instruments are equally stable and there are no obvious differences in the cost-benefit analysis, I think very often the decision can rest on very subtle things. On the surface, it may look like either instrument might suffice; but in fact, each type of data is a little different. There are subtle differences in the amount of DNA that’s required, how clean it has to be, the types of errors in the sequence you get from the two platforms, and so forth. That’s really for the investigators you work with to be mindful of and to help you as a lab manager make the right decision. It may turn out that certain instruments are significantly more appropriate for a particular application.

Q: What would you say to users who have not upgraded their sequencing technology in years, either because what they have works or they’re not sure when and where to make their investment?

A: It’s very difficult to see how using old technology, particularly Sanger technology, is sustainable. It is literally thousands of times more expensive and very limited in terms of the amount of sequencing you can do. So any research institution that wants its researchers to be competitive, in terms of the grants they put in, absolutely has to upgrade. For a new facility that is trying to decide what to do, I still think that there are many common applications that existing platforms are going to be good for—for at least another two or three years—before they’re replaced by something else. If you’re a new operation, a great deal of what you need to learn stems from the fact that these instruments produce so much data. You usually have to deal with large numbers of different samples: bar-coding them, pooling them together, running them, and deconvoluting them. The ability to do proper sample and data management is going to be needed no matter what instrument you invest in, and it will translate to any of the platforms that come out in the future.

Q: What are you most excited about in terms of the changes in this field?

A: Well, I think it’s the reduction in run time. You sacrifice some of the unit cost, since the dollars per base pair are a little higher, and the total amount of data produced is a little lower, but there are many applications that are much better suited to the new lower-cost instruments with shorter run times. I think that’s a very interesting development. If that continues to evolve over the next couple of years, and you get a greater range in the amount of data you can produce, and the cost continues to come down while the run time stays short, that will have a big impact on a lot of different applications.

Q: What has your experience been when working with the vendors? Is this a market where you can go to a vendor and watch a demonstration, or can you actually test-drive the instrument for a few weeks to see whether it’s the right fit for you?

A: We’ve had pretty good luck with vendors, but we’re a big customer, so I don’t know whether we are given special treatment or not. Often when we get a demo instrument, it’s going around to a lot of different sites, so I think all that is possible. The key to taking advantage of that is to have your samples and your applications very well defined and ready to go, so that in the limited time you have an instrument, you can get the most out of it. We’ve usually found that the vendors are very pleased to work with us because we’re very organized. We learn a lot about the instrument by being organized and really putting a lot of effort into it for the time that it’s with us.

George Weinstock, Ph.D., is professor of Genetics and Molecular Microbiology at Washington University. He applies high-throughput DNA sequencing, genome-wide analysis, bioinformatics, and other genetic methods to the study of human, model organism, and microbial genomes. His goal is to apply genetic and genomic thinking to important problems in biology. He led one of the first bacterial genome projects, sequencing Treponema pallidum, the causative agent of syphilis. He is now a leader of the Human Microbiome Project, studying the collection of microbes that colonize the human body. The goal of this project is to analyze the genomes of these organisms, characterize the communities they form, and measure how the communities change in different health and disease states. Dr. Weinstock was one of the leaders of the Human Genome Project and also of the first personal genome project, which sequenced Dr. James Watson’s genome using next-generation sequencing technology. He was previously co-director of the Human Genome Sequencing Center and professor of Molecular and Human Genetics at Baylor College of Medicine. He received his B.S. degree in biophysics from the University of Michigan and his Ph.D. in microbiology from MIT.