Stephen Bustin, PhD, professor of molecular medicine at Anglia Ruskin University in the UK and a world-renowned expert in PCR (polymerase chain reaction), discusses the persistent lack of transparency and reliability in current PCR data. Nevertheless, he remains optimistic about the continued use of the technology, pointing to the development of portable, field-based devices and diagnostic applications, and the advent of digital PCR.
Q: What are the big challenges that we’re still grappling with in PCR?
A: Although PCR is conceptually an extremely easy technique to use, it's still not as reliable as it could be. We know what the problems are and what proper implementation requires, but unfortunately we don't address these things. Several publications in recent years have indicated that a vast majority of PCR data, at least in the biomedical sciences, are not reliable. It's not just PCR but a whole range of molecular techniques that are being misused, and that causes a lot of problems in the interpretation and use of results. This has been the case for a long time, and in recent years it has become worse. Surprisingly, even with all these problems, PCR remains a popular technique, because it is so very useful.
Q: Does digital PCR bypass the problems encountered with traditional PCR?
A: Digital PCR can complement traditional qPCR for certain applications and, if carried out properly, it certainly solves some of the problems, particularly in data analysis. However, primer specificity remains a problem with digital PCR as well. Assay efficiency, although perhaps more important for qPCR, still matters for digital PCR, particularly when looking at low-copy-number targets. Questions about which digital PCR system to use, and about the reproducibility and reliability of the data, also remain. A lot of people are now attempting to use digital PCR for RNA analysis, and there the problems associated with the reverse transcription (RT) step will always exist. Digital PCR will certainly be more useful for niche applications, and while some problems are reduced, others remain. If anything, analysis of the data becomes more important. We published the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines for digital PCR nearly two years ago, but the problem remains that people don't report what they are doing and how they get their results. Until this issue of transparency is addressed, we are going to have problems, no matter what technique we use.
Q: What would be your recommendations to lab managers working with PCR?
A: When people publish, they should give enough information for others to repeat the experiment or to judge whether the analysis makes sense. How do you select and check the quality of your samples? How do you store and prepare your DNA or RNA? How much template are you using for your PCR or RT-PCR? These are all questions that need to be answered. Everything we discuss in the MIQE guidelines is something every lab should be doing, and it should be published at some stage in an online supplement.
Q: Is lack of transparency due to lack of knowledge or lack of interest?
A: Journals don't care about the lack of reproducibility. Until that changes, no one is going to make any effort to be transparent and share detailed experimental protocols. The second issue is that most people don't know what they are doing. They follow the directions in the PCR kits and then look at the results the instruments give, without sitting down and thinking about what those results mean. As long as they get a "p" value, which I call the "publication" value, below 0.05, that's all they care about. Often you see the wrong primer listed in a publication, an inappropriate normalization procedure, or efficiency numbers that are not achievable. The changes in expression profile that people report are sometimes far beyond the range of what a PCR can detect. Digital PCR may help with this, but overall the quality of PCR data being reported is quite poor. The same problem exists in immunochemistry, where we use antibodies, and in next-generation sequencing, whenever the raw data are not made available. Ironically, low-impact-factor, specialized journals seem to publish higher-quality data, since their reviewers tend to know the techniques better and can spot flaws. We have recently launched a new journal that aims to publish technically sound papers by having certain transparency criteria in place.
Q: Won’t this lack of reliability impact PCR data and its use for diagnostic applications?
A: It really is frightening. So many drugs fail and so many results can't be reproduced because data are improperly handled, and not just PCR data but clinical trial data as well. This is an all-pervasive attitude that has crept into science: so much has been invested in techniques and methodology that people don't want to take a step back and think. People just want to move on and do new things, without stopping to ask whether it makes sense and how we should change things. We trust scientists and let them self-regulate, and it just doesn't work. Scientists are no less biased than other people.
Q: Are you excited about any of the new developments in PCR?
A: Digital PCR is a very nice way of accurately quantifying low copy numbers of new targets, and we are using it to look at fungal pathogens. With qPCR we are never sure how much we have, and we cannot reliably distinguish the lowest amounts of target. Now we can use digital PCR to quantify viral and fungal load quite accurately and reach lower levels of sensitivity than we could with qPCR. The move toward nanotechnology is something else that is extremely useful. The emphasis on getting PCR out into the field with handheld devices is also interesting. The combination of fast PCR, tiny volumes, and portable devices is going to give PCR a whole new lease on life.
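The absolute quantification Dr. Bustin describes rests on partitioning the reaction into thousands of tiny compartments, counting how many light up, and applying a Poisson correction to account for partitions that received more than one molecule. A minimal sketch of that arithmetic (the function name and the 0.85 nl partition volume are illustrative assumptions, not tied to any particular instrument):

```python
import math

def dpcr_copies(positive, total, partition_vol_nl=0.85):
    """Estimate target copies from a digital PCR run.

    Each partition is scored positive (>= 1 template molecule) or
    negative. Assuming molecules distribute randomly (Poisson), the
    mean copies per partition is lambda = -ln(1 - p), where p is the
    fraction of positive partitions; this corrects for partitions
    that happened to receive more than one molecule.
    """
    if positive >= total:
        raise ValueError("saturated run: every partition is positive")
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per partition
    total_copies = lam * total               # copies in the whole reaction
    copies_per_ul = lam / (partition_vol_nl * 1e-3)  # concentration
    return total_copies, copies_per_ul
```

Note that the estimate degrades at both extremes: with very few positives the counting noise dominates, and near saturation the logarithm blows up, which is why low-copy-number targets are the sweet spot for this technique.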
Q: Do you think PCR will improve as it moves into diagnostics and other highly regulated environments?
A: I have been working with qPCR for nearly 17 years and there have been improvements. As PCR moves into diagnostics it will become more tightly regulated and controlled, and that can only be a good thing. Unfortunately a lot of the diagnostic applications are based on research precursors, and that’s why some of the problems will continue. A lot of pharma and biotech companies may end up wasting a lot of money if the initial experiments are not done properly. It’s remarkable how companies that supply the reagents are incredibly supportive of the MIQE guidelines. Their product application specialists are sometimes a lot better at PCR analysis than are the researchers themselves. Most of these companies have invested a lot in training their people to follow the guidelines, and that’s all very positive.
Q: How important is training in PCR?
A: If I am in charge of a lab, I will make sure the people working there follow certain fixed procedures, and not just for PCR. I will make sure no result leaves my lab without me checking that it's genuine. Lab managers have a great opportunity and responsibility here. They should go and inform themselves about what needs to be done, and that's not difficult, since there is so much published on what's relevant. With PCR, the efficiency and specificity of the primers are universally important criteria. Which chemistry you use depends on what you are trying to do, but some fundamentals remain unchanged. For every technique you use there are criteria that are essential to follow, and lab managers have to make sure the results they ultimately produce are meaningful.
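The primer efficiency mentioned above is conventionally estimated from a dilution-series standard curve: quantification cycle (Cq) is regressed against log10 of template input, and the slope gives the efficiency via E = 10^(-1/slope) - 1, where a slope near -3.32 corresponds to perfect doubling (100% efficiency). A small sketch of that calculation (function name and inputs are hypothetical):

```python
def qpcr_efficiency(log10_inputs, cq_values):
    """Amplification efficiency from a dilution-series standard curve.

    Fits Cq against log10(template amount) by ordinary least squares;
    the slope gives efficiency E = 10^(-1/slope) - 1. Perfect doubling
    per cycle corresponds to a slope of about -3.32 (E = 1.0, i.e. 100%).
    """
    n = len(log10_inputs)
    mx = sum(log10_inputs) / n
    my = sum(cq_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_inputs)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(log10_inputs, cq_values))
    slope = sxy / sxx                  # expected to be negative
    return 10 ** (-1.0 / slope) - 1.0  # fractional efficiency
```

Reporting this number for each assay, as the MIQE guidelines ask, is exactly the kind of fixed procedure a lab manager can enforce.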
Q: What are some of the resources that lab managers can make use of?
A: First, managers should contact people who know what they are talking about, invite them to their labs to survey things, and get their advice. Secondly, I would look at the relevant publications and see what people are proposing. Thirdly, I would try to standardize as much as possible. We know that different instruments and reagents give different results, so find out how a particular reagent interacts with your samples and know what accuracy and problems to expect. It involves talking to people outside one's own area, getting their input, and then implementing some common-sense procedures. There is never really one reagent that works best in every circumstance, so you have to test and see which one works best for your sample and experiment. The preparation of an experiment is more important than actually carrying it out. If your reagents, instruments, and lab are all set up well, then you are more likely to succeed.
Stephen Bustin obtained his PhD in molecular genetics from Trinity College, University of Dublin, Ireland, and is currently professor of molecular medicine at Anglia Ruskin University in the UK. His research interests center on developing novel approaches for the early diagnosis of fungal and bacterial pathogens. He has authored numerous papers, review articles, and book chapters aimed at improving the reproducibility and robustness of real-time quantitative PCR (qPCR), including three books: "A-Z of Quantitative PCR" (2004), "The PCR Revolution" (2011), and "PCR Technology" (2013). He published the first online series of qPCR-related books under the title "Definitive qPCR" (www.qPCRexpert.com). He led an international consortium that developed the MIQE guidelines for the use and reporting of qPCR (2009) and digital PCR (2013). Dr. Bustin has extensive editorial involvement as editor-in-chief of Biomolecular Detection and Quantification (Elsevier), editor-in-chief of the Gene Expression section of the International Journal of Molecular Sciences, and a member of the editorial boards of several peer-reviewed journals.