Lab Manager | Run Your Lab Like a Business

The Right Water

Laboratories use large volumes of water to support the vast majority of daily research, analysis, and processing applications—from autoclave feeds and buffer preparation to molecular biology, analytical chemistry, and cell culture work.

by Jim Keary

Intelligent Water Quality Decisions are Vital to Doing Great Research

Using the limited data available, we estimate that a laboratory will consume approximately 35 million liters of water per year.1,2,3 In stark contrast to this stands the relative lack of attention often paid to the quality of this water, beyond using distilled or deionized water for sensitive experiments.

The impact of unseen contaminants

There are several major groups of potential water contaminants, and they can affect the generation of reliable and accurate data in different ways.

Dissolved compounds

Dissolved compounds come in numerous varieties. Dissolved inorganic compounds (e.g., metal ions like sodium, calcium, and iron, or compounds like nitrates and salts) make up the bulk of water contaminants and will primarily affect proteins, interfering with protein solubility as well as the successful formation of protein-protein and protein-lipid interactions. Inorganic compounds can also affect PCR, as DNA polymerases are highly sensitive to various common cations (e.g., Cu2+, Fe2+, Ni2+), which can disrupt substrate binding and inhibit enzyme activity. When immunohistochemistry (IHC) is performed, several metal ions can cause unwanted precipitation reactions when they are present at high enough concentrations in staining solutions, or they can even interfere with antibody-antigen binding reactions.

Dissolved organic compounds, which are typically biological in origin, can lead to the promotion of bacterial growth. This can affect multiple processes such as IHC (via bacterial release of alkaline phosphatase), molecular biology (via bacterial release of nucleases), and general hygiene levels of equipment (through biofilm establishment). They can also reduce the overall sensitivity of analytical techniques such as high-performance liquid chromatography (HPLC) by competing with the analyte in the mobile phase, reducing the effective levels of analyte retained in the column. Blotting techniques are also at risk from organic molecules, as stray negatively charged molecules can interfere with the hybridization process by binding nonspecifically in place of DNA or RNA, upsetting the hybridization procedure.

Finally, dissolved gases can cause problems by creating ionic instabilities—for example, carbon dioxide will be absorbed in water to form carbonic acid, which can alter pH; this, in turn, can reduce the capacity of anion exchange resins. In addition to this, the solubility of air in water will directly affect the concentrations of both oxygen and nitrogen, variables that can alter the rates of certain biochemical reactions and impact the reproducibility of results. High concentrations of these dissolved gases can even result in bubble formation; bubbles have the potential to disrupt spectrophotometric measures and can impede media flow through microchannels and columns.
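The pH shift from dissolved carbon dioxide is easy to quantify: water left open to the atmosphere equilibrates with CO2 and drops from pH 7 to roughly pH 5.6. Below is a minimal sketch of that calculation, using textbook values (assumed here, not taken from this article) for the Henry's law constant and the first dissociation constant of carbonic acid at 25 °C:

```python
import math

# Assumed textbook constants (25 °C); not from the article itself
KH = 3.3e-2      # Henry's law constant for CO2, mol/(L*atm)
P_CO2 = 4.2e-4   # approximate atmospheric partial pressure of CO2, atm
KA1 = 4.45e-7    # first acid dissociation constant of carbonic acid

# Dissolved CO2 concentration at equilibrium (Henry's law)
c_co2 = KH * P_CO2

# Weak-acid approximation for a dilute acid: [H+] ~ sqrt(Ka1 * C)
h_plus = math.sqrt(KA1 * c_co2)
ph = -math.log10(h_plus)

print(f"dissolved CO2: {c_co2:.2e} mol/L")
print(f"pH of air-equilibrated water: {ph:.2f}")  # ~5.6
```

The same equilibrium is why freshly purified water stored in an open container does not stay at neutral pH for long.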


Bacteria

The bacterial contamination of water can lead to numerous errors, particularly in microbiology and molecular biology/genetics. In addition to macroscale contamination in the form of biofilms, bacteria release both endotoxins and nucleases. Nucleases will break down any nucleic acids (such as DNA or RNA) present in a sample, while endotoxins can affect cell culture and/or in vitro fertilization techniques. Many bacteria also release alkaline phosphatase (AP), which can interfere with IHC protocols that use AP for chromogenic detection. Various microorganisms can affect biochemical reactions by competing with substrates at enzyme active sites, compromising multiple lab assays and reactions.

Free-floating (planktonic) bacteria can trigger the formation of biofilms on surfaces that, once initiated, can continue to develop for several years. The presence of biofilms on lab surfaces represents an ongoing source of contamination since the bacteria continue to be released at sporadic intervals (along with the associated endotoxins and nucleases discussed previously).

Suspended matter

Particles suspended in the water column can include anything from biological vegetation and silt to colloidal matter and pathogens adsorbed onto other particles. Such factors can have obvious detrimental effects; for example, by blocking filters, chromatography columns, or osmosis membranes.

During HPLC, particles and colloids can also elevate back-column pressure, which may damage pumps and compromise the overall integrity of the system. Their presence in sensitive techniques such as spectrometry and spectroscopy can lead to inaccurate data, primarily through poor calibration caused by distorted blank or working samples.

Different types of water

Table 1. Purity specifications for laboratory water grades

Grade     Resistivity (MΩ·cm)  TOC (ppb)  Bacteria (CFU/mL)  Endotoxin (EU/mL)
Type I+   18.2                 < 5.0      < 1.0              < 0.03
Type I    > 18.0               < 10.0     < 1.0              < 0.03
Type II+  > 10.0               < 50.0     < 10.0             N/A
Type II   > 1.0                < 50.0     < 100.0            N/A
Type III  > 0.05               < 200.0    < 1000.0           N/A
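Resistivity and conductivity are reciprocals, so the resistivity figures quoted for each grade map directly onto the conductivity values displayed by many purification systems. A short sketch of the conversion (the function name is illustrative):

```python
def resistivity_to_conductivity(r_megaohm_cm: float) -> float:
    """Convert resistivity (MΩ·cm) to conductivity (µS/cm).

    MΩ and µS are exact reciprocal units, so the conversion is 1/R.
    """
    return 1.0 / r_megaohm_cm

# Type I+ water at the theoretical maximum resistivity of 18.2 MΩ·cm
print(f"{resistivity_to_conductivity(18.2):.4f} µS/cm")  # 0.0549 µS/cm

# Type III water at 0.05 MΩ·cm corresponds to 20 µS/cm
print(f"{resistivity_to_conductivity(0.05):.1f} µS/cm")
```

The 18.2 MΩ·cm figure is the theoretical maximum for pure water at 25 °C, which is why no grade exceeds it.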

Water purification processes have advanced significantly beyond simple filtration methods. In order to deliver ultrapure levels, the water is subjected to a multistage process. The use of techniques such as reverse osmosis, ion exchange, electrodeionization, and ultrafiltration allows for the production of high-caliber water with physicochemical properties that can be accurately measured in order to produce consistent, referenceable standards of water quality (see tables 1 and 2).

Type III This is the grade recommended for non-critical work, which may include glassware rinsing, water baths, autoclave and disinfector feeds, as well as environmental chambers and plant growth rooms. These systems can also be used to feed Type I systems.
Type II Type II water is employed for general laboratory use. This may include media preparation, the creation of pH solutions and buffers, and for certain clinical analyzers. It is also common for Type II systems to be used as a feed to a Type I system.
Type II+ This is the grade for general laboratory applications requiring higher inorganic purity than that afforded by standard Type II water.
Type I Often referred to as ultrapure, this grade is required for some of the most water-critical applications, such as HPLC mobile phase preparation,4 as well as blanks and sample dilution for other analytical techniques including gas chromatography (GC), atomic absorption spectrophotometry (AAS), and inductively coupled plasma mass spectrometry (ICP-MS). Type I is also required for molecular biology applications, as well as other sensitive techniques such as mammalian cell culture and in vitro fertilization (IVF).
Type I+ Goes beyond the purity requirements of Type I water and is used in processes requiring the highest levels of water purity.
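The grade recommendations above can be captured in a simple lookup, handy as a quick sanity check when planning an experiment. The mapping below merely restates the text; the application names and the default-to-purest policy are illustrative choices, not part of any standard:

```python
# Recommended water grades, as described in the text (illustrative mapping)
WATER_GRADE = {
    "glassware rinsing": "Type III",
    "autoclave feed": "Type III",
    "media preparation": "Type II",
    "buffer preparation": "Type II",
    "hplc mobile phase": "Type I",
    "icp-ms blanks": "Type I",
    "mammalian cell culture": "Type I",
}

def recommended_grade(application: str) -> str:
    """Look up the recommended grade; default to the purest when unknown."""
    return WATER_GRADE.get(application.lower(), "Type I")

print(recommended_grade("HPLC mobile phase"))   # Type I
print(recommended_grade("glassware rinsing"))   # Type III
```

Defaulting to Type I for unlisted applications errs on the side of purity, at the cost of running the most expensive system more than strictly necessary.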

Many labs will also make use of distilled and/or double-distilled water, which is produced via a slow, energetically intensive distillation process. Although this method has the benefit of producing water with a long shelf life, water purified in this way is still prone to recontamination if it is stored for overly long periods of time.

Choosing the right water quality

These defined water types make it straightforward to discern which grade is most suitable for the experiment in question, and to ensure that the selected quality remains consistent throughout the research. Understanding which water to use in each instance not only yields more robust data, but can also help manage the research budget effectively. By making appropriate, informed decisions about the water quality used in different experiments and lab processes, it becomes easier to produce accurate and precise results, and thus reliable data that lends itself to subsequent downstream analyses.

From the information covered here, it should be apparent that selecting the right level of water purity is of great importance and should form an integral part of planning any experiment. Water purity can have substantial effects on data accuracy and precision, and with analytical equipment becoming increasingly sensitive, selecting the right water purity is ever more vital to doing great research. To find out more about water, the most widely used reagent in the lab, see our new infographic (reproduced here).


1. European Commission (DG ENV). June 2009. Final Report, Study on water performance of buildings. Available at:  [Accessed June 19, 2014].

2. Good Campus Guide, S-Lab Briefing 5: Reducing Water Consumption in Laboratories. Available at:  [Accessed June 19, 2014].

3. University of Oxford and AECOM Ltd., University of Oxford Water Management Strategy Report. January 2011. Available at: [Accessed June 19, 2014].

4. Whitehead, P. 1998. Ultra-pure water for HPLC. Why is it needed and how is it produced? Laboratory Solutions, December issue.

5. Based on decades of experience obtained by the water purification experts at ELGA LabWater.

6. Lane, A.N., Arumugam, S. 2005. Improving NMR sensitivity in room temperature and cooled probes with dipolar ions. Journal of Magnetic Resonance, 173(2), pp. 339-343.

7. Rezaei, K., Jenab, E., Temelli, F. 2007. Effects of water on enzyme performance with an emphasis on the reactions in supercritical fluids. Critical Reviews in Biotechnology, 27(4), pp.83-95.