
Exploring Different Types of Pure Water for Laboratories

27 October 2025

Water is one of the most vital resources on the planet, serving as a fundamental medium for all life forms. But its importance extends far beyond biology: water also plays an indispensable role in scientific research and industrial processes, where even trace amounts of impurities can compromise results and undermine the integrity of experiments and operations. This makes purified water critical for processes that must meet stringent standards. But not all pure water is the same. Various types of pure water are used in laboratories for different purposes, and this blog post will guide you through them.

What is Pure Water?

Water naturally contains a range of different substances, including ions, organic molecules, bacteria, particulates, and other contaminants. Pure water is produced by extensively treating feedwater to eliminate these impurities, removing nearly all soluble substances. A key indicator of pure water is its high resistivity, which reflects its ability to resist the flow of electrical current. Other characteristics include low total organic carbon (TOC) and low conductivity, along with minimal levels of colloids, dissolved ions, and gases.

What Are the Different Types of Pure Water in Laboratories?

According to the American Society for Testing and Materials (ASTM D1193-06 standards), the main types of pure water that are commonly used in laboratories are categorised as follows:

Type I: Ultrapure Water

What it is
Type I water is considered the gold standard for highly sensitive analytical processes. Also known as ultrapure water or reagent-grade water, it is the highest-quality and purest form of water available for laboratory use. 

Specifications 

  • Resistivity: ≥ 18.2 MΩ·cm
  • Conductivity: ≤ 0.055 μS/cm
  • TOC: < 10 ppb

Common Applications

  • Cell and tissue culture
  • Molecular biology techniques 
  • Gas chromatography (GC)
  • Inductively coupled plasma mass spectrometry (ICP-MS)
  • High-performance liquid chromatography (HPLC)


Type II: Pure Water for General Laboratory Use

What it is

While Type II water is not as pure as Type I, it still maintains a high level of purity, making it suitable for standard tests and procedures that do not demand ultrapure water.

Specifications 

  • Resistivity: ≥ 1 MΩ·cm
  • Conductivity: ≤ 1 μS/cm
  • TOC: < 50 ppb

Common Applications

  • Electrochemistry
  • General spectrophotometry
  • Microbiological media preparation
  • Flame atomic absorption spectroscopy (FAAS)


Type III: RO Water

What it is

Often referred to as primary grade water or RO water, this type is produced using several treatment methods, including reverse osmosis (RO) and carbon filtration. These processes remove the majority of contaminants (up to 99%), but Type III sits well below Type I in overall purity; note that under ASTM D1193 its resistivity specification is actually stricter than Type II's, while its TOC limit is looser. RO water is typically used for non-critical laboratory processes and can also serve as feedwater for the production of Type I water.

Specifications 

  • Resistivity: ≥ 4 MΩ·cm
  • Conductivity: ≤ 0.25 μS/cm
  • TOC: < 200 ppb

Common Applications

  • Washing and rinsing glassware
  • Heating baths
  • Media and buffer preparation


Type IV: Feedwater

What it is

Type IV water, also known as semipure water, contains several common contaminants, such as microorganisms and dissolved gases, that make it unsuitable for critical laboratory applications. However, it is frequently used as feedwater in the production of higher-purity water types.

Specifications 

  • Resistivity: ~0.2 MΩ·cm
  • Conductivity: ≤ 5 μS/cm
  • TOC: No specific requirement or standard

Common Applications

  • Pre-treatment feed
  • Washing glassware
  • Other non-critical laboratory tasks
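The four sets of limits above lend themselves to a quick conformance check. The helper below is an illustrative sketch (not part of any standard tooling) that lists every ASTM type whose resistivity floor and TOC ceiling a sample meets; note that a sample can satisfy Type II without satisfying Type III, since Type III's resistivity floor is higher:

```python
# Limits quoted above: (minimum resistivity in MΩ·cm, maximum TOC in ppb).
# Type IV carries no TOC limit in the standard.
ASTM_SPECS = {
    "Type I":   (18.2, 10),
    "Type II":  (1.0, 50),
    "Type III": (4.0, 200),
    "Type IV":  (0.2, None),
}

def types_met(resistivity_mohm_cm: float, toc_ppb: float) -> list[str]:
    """Return every ASTM type whose limits the sample satisfies."""
    met = []
    for name, (min_res, max_toc) in ASTM_SPECS.items():
        if resistivity_mohm_cm >= min_res and (max_toc is None or toc_ppb < max_toc):
            met.append(name)
    return met

print(types_met(18.2, 5))  # ['Type I', 'Type II', 'Type III', 'Type IV']
print(types_met(2.0, 40))  # ['Type II', 'Type IV']
```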


American Society for Testing and Materials Standards for Laboratory Reagent Water

  • Type I: resistivity ≥ 18.2 MΩ·cm; conductivity ≤ 0.055 μS/cm; TOC < 10 ppb
  • Type II: resistivity ≥ 1 MΩ·cm; conductivity ≤ 1 μS/cm; TOC < 50 ppb
  • Type III: resistivity ≥ 4 MΩ·cm; conductivity ≤ 0.25 μS/cm; TOC < 200 ppb
  • Type IV: resistivity ~0.2 MΩ·cm; conductivity ≤ 5 μS/cm; TOC not specified


Other International Classifications

Several other global institutions have also established specific standards for water quality. In 2006, the Clinical and Laboratory Standards Institute (CLSI) moved away from the conventional Type I, II, and III terminology and introduced a ‘fit-for-purpose’ approach to water quality. It places particular emphasis on Clinical Laboratory Reagent Water (CLRW), with brief outlines of Special Reagent Water (SRW) and instrument feedwater. The International Organisation for Standardisation (ISO 3696:1987), on the other hand, classifies water into Grades 1 to 3, with Grade 1 being the purest. The grades are differentiated by maximum conductivity, silica content, and the amount of organic matter in the water. It is essential to check the instrument or manufacturer specifications, however, as clinical analysers often require CLRW or device-specific water quality rather than standard ASTM Type II water.

International Organisation for Standardisation (ISO 3696) Water Quality Grades

How is the Purity of Laboratory Water Assessed?

The purity of laboratory water is typically assessed by measuring the following key parameters:

Conductivity

This is an important water quality parameter, as the conductivity of water increases with the presence of pollutants such as dissolved salts, minerals, and organic compounds. Pure water has very low conductivity due to the near-absence of free ions. Simply put, the purer the water, the lower its conductivity. Temperature also matters: higher water temperatures cause charged ions to move faster, increasing the water’s ability to conduct electricity.
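Because of the temperature effect, readings are normally referred back to 25 °C before comparison. The sketch below assumes a simple linear coefficient of ~2 %/°C, which is typical for dilute salt solutions; ultrapure water actually needs a steeper, nonlinear correction, so treat this only as an illustration of the principle:

```python
def conductivity_at_25c(measured_us_cm: float, temp_c: float,
                        alpha: float = 0.02) -> float:
    """Refer a conductivity reading (µS/cm) back to 25 °C.

    Assumes a linear temperature coefficient `alpha` (fraction per °C);
    0.02 (~2 %/°C) is a common figure for dilute ionic solutions.
    """
    return measured_us_cm / (1.0 + alpha * (temp_c - 25.0))

# A 1.2 µS/cm reading taken at 30 °C corresponds to ~1.091 µS/cm at 25 °C
print(round(conductivity_at_25c(1.2, 30.0), 3))  # 1.091
```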

Resistivity

Water resistivity, as the term suggests, measures how strongly water resists an electrical current. High resistivity indicates a very low ionic content, which in turn signifies higher water purity. Since Type I water has very few ions, it strongly resists the flow of an electrical current. This is key for applications where even minute amounts of ionic contamination can compromise the integrity of the results.
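Resistivity and conductivity are reciprocals of one another, which is why the Type I limits of 18.2 MΩ·cm and 0.055 μS/cm describe the same water. A one-line illustration:

```python
def resistivity_mohm_cm(conductivity_us_cm: float) -> float:
    """Convert conductivity (µS/cm) to resistivity (MΩ·cm).

    Resistivity is the reciprocal of conductivity; with conductivity
    expressed in µS/cm, the reciprocal comes out directly in MΩ·cm.
    """
    return 1.0 / conductivity_us_cm

# Theoretically pure water at 25 °C: 0.055 µS/cm ≈ 18.2 MΩ·cm
print(round(resistivity_mohm_cm(0.055), 1))  # 18.2
```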

Total Organic Carbon (TOC)

Certain industrial applications, such as those used in the pharmaceutical and electronics industries, require maintaining consistency in research and manufacturing processes to comply with regulatory standards. Measuring the total organic carbon (TOC) plays a crucial role in this process, as it can help assess the level of organic contamination. This ensures the water used is suitable for the laboratory application and free from any contaminants that could compromise the process.

Biological Contamination

Biological contamination is a common issue in untreated water, and controlling it is key for critical cell culture and clinical applications. It can not only skew results and compromise research, but also pose serious risks to the health of lab personnel and the environment. Various biochemical and culture-based tests are typically used to assess biological contamination. In some instances, it may also be monitored by epifluorescence testing, which distinguishes living from dead organisms, and by ATP measurements, which assess microbial activity.

Presence of Colloids

Often present in laboratory water, colloids are fine particles (~1 nm to 1 μm) that can cause turbidity. Colloidal fouling occurs when these particles accumulate on and interfere with the treatment system, reducing water productivity and compromising product quality. Suspended particles can also clog filters and must be removed to ensure purity. The Fouling Index, also known as the Silt Density Index (SDI), is used to evaluate the potential for membrane blockage. While there are many methods for colloid removal, the most common are mechanical filtration, coagulation-flocculation, and ultrafiltration.
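The Silt Density Index comes from a standardised plugging test (ASTM D4189): the time to filter a fixed volume (typically 500 mL) through a 0.45 μm membrane at constant pressure is measured at the start and again after a set interval, usually 15 minutes. A minimal sketch of the calculation, assuming those standard test conditions:

```python
def silt_density_index(t_initial_s: float, t_final_s: float,
                       elapsed_min: float = 15.0) -> float:
    """SDI from the standard plugging test.

    t_initial_s / t_final_s: seconds to filter the fixed sample volume
    at the start of the test and after `elapsed_min` minutes.
    SDI = 100 * (1 - t_initial / t_final) / elapsed_min
    """
    return 100.0 * (1.0 - t_initial_s / t_final_s) / elapsed_min

# Example: 30 s for the first 500 mL, 40 s for the same volume 15 min later
print(round(silt_density_index(30, 40), 2))  # 1.67
```

Lower values indicate less fouling potential; feedwater for RO membranes is commonly required to stay below an SDI of about 5.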


Why Does Water Purity Matter in a Laboratory?

As mentioned previously, impurities have a significant impact on the accuracy of laboratory work and can also affect the equipment itself. They can cause errors in chemical reactions, resulting in false positives or negatives, and inhibit enzymatic reactions that can affect experimentation. Equipment damage resulting from the buildup of contaminants is a common issue that can lead to frequent replacements and revenue loss. Purity ensures that water serves as an inert solvent rather than a reactive one, resulting in several benefits. 

Benefits:

  • Reduces contamination (prevents interference with biochemical processes)
  • Protects laboratory equipment
  • Supports critical laboratory applications (e.g., reduces signal noise in analytical devices)
  • Maintains data accuracy (for trials and experiments)
  • Ensures compliance with industry standards

What Are the Most Common Pure Water Treatment Technologies? 

While there are several different water purification technologies in Australia, here’s a breakdown of some of the most commonly used methods:

Filtration

This technique uses physical barriers, such as sand and gravel filters, and/or chemical absorption (e.g., activated carbon) to remove impurities from different types of water. Filtration offers numerous benefits, including low operational costs and ease of use. However, it can only remove specific physical and chemical contaminants.

Reverse Osmosis

This advanced filtration method uses high pressure to force water through a semi-permeable membrane with an ultra-fine pore size. Offering an exceptional filtration efficiency, reverse osmosis removes up to 99% of total dissolved solids (TDS) and impurities. However, it also generates a large amount of wastewater and incurs higher costs for membranes and pretreatment systems.

Deionisation

Deionisation technology removes dissolved ions, such as salts and minerals, from water using specialised ion-exchange resins. Cation-exchange resin captures positively charged cations, such as magnesium (Mg²⁺) and sodium (Na⁺), and releases hydrogen (H⁺) ions in exchange; anion-exchange resin captures negatively charged anions, like chloride (Cl⁻) and sulphate (SO₄²⁻), and releases hydroxide (OH⁻) ions. The released H⁺ and OH⁻ ions then combine to form pure water (H₂O). As resin regeneration involves chemical use, deionisation is associated with higher running costs.

Ultrafiltration

This water purification method removes contaminants, such as suspended solids and other particles, using a low-pressure semi-permeable membrane. Popular for its cost-effectiveness and high pathogen removal rate, ultrafiltration does not require any chemicals and is also environmentally friendly. However, it cannot remove certain contaminants such as dissolved salts and heavy metals.

UV Disinfection

UV disinfection is a safe, effective, and chemical-free water purification technology that uses UV light to eliminate harmful microorganisms. By penetrating cell membranes and damaging microbial DNA, UV light prevents these organisms from reproducing and causing infections. While this method is used as the final disinfection step in many industries, it cannot remove physical or chemical contaminants.

What Key Industries Depend Heavily on Pure Water?

Across heavily regulated industries such as food and beverage, medical, pharmaceutical, and electronics, water purity is not just preferred; it is legally required. Australia’s regulatory bodies enforce strict hygiene and safety protocols to ensure product integrity and to protect public health and the environment. Various types of pure water are also essential for research and development (R&D) studies and power plants, ensuring accurate experimental outcomes and preventing damage to machinery.

What Are the Common Pitfalls of Selecting Pure Water?

From compromised results to equipment damage, the risks of overlooking the vital factors in selecting the right type of pure water can weigh heavily on your entire operation. Let’s have a look at the common pitfalls early on so that you can protect your laboratory from such setbacks:

Underestimating the Required Volume of Pure Water

To avoid this, we recommend making an approximate calculation of the daily water requirements for each type, based on factors such as daily volume usage and peak usage periods. It is also advisable to plan for a slightly higher volume to account for times when greater volumes may be needed.
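One way to make that calculation concrete is a simple headroom estimate: sum the expected daily draw per application, then add a margin for peak periods. The application names, draw figures, and 25% margin below are purely illustrative assumptions, not recommendations:

```python
# Hypothetical daily draws per application, in litres (illustrative only).
daily_draws_l = {
    "analyser feed": 20,       # Type I polishing loop feed
    "glassware rinsing": 60,   # Type III
    "buffer preparation": 15,  # Type II
}

def sizing_estimate(draws: dict[str, float], headroom: float = 1.25) -> float:
    """Daily system capacity: total expected demand plus peak headroom."""
    return sum(draws.values()) * headroom

print(sizing_estimate(daily_draws_l))  # 118.75 litres/day
```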

Inadequate Assessment of Water Quality Requirements

This is another common pitfall that can lead to inaccurate results and resource wastage, as some applications don’t require a higher grade of water purity.

Neglecting to Monitor Water Quality Parameters

Failing to track water quality during operation can allow contamination or system inefficiencies to go unnoticed, compromising results and increasing maintenance costs. 

How Should Lab Managers Choose the Right Water Type for Their Needs?

While it’s essential to be aware of the factors commonly overlooked when selecting the right water type, it’s equally important to understand the criteria that guide the right choice. Here’s what we recommend:

  1. Identify the water purity and total volume required for each laboratory application to achieve the desired quality while minimising any additional costs.
  2. Select systems with multi-stage filtration to ensure effective contaminant removal, resulting in lower long-term costs, although this may vary depending on your specific requirements. 
  3. Consider your lab layout and building design to determine the most efficient solution (e.g., loop, floor-by-floor setups, or individual systems).
  4. Ensure the system you choose complies with monitoring, certification, and regulatory standards.
  5. Pay attention to maintenance and service requirements to avoid unnecessary expenses and to ensure the system operates efficiently throughout its lifespan. 

Across International: Trusted ISO-Certified Manufacturer for High-Quality Lab Equipment

While pure water plays a key role in several industries, it’s the right type of equipment that makes its use effective, especially in laboratories and industrial settings. At Across International, we manufacture a wide range of laboratory-grade equipment for various fields and applications, including freezers and refrigerators, rotary evaporators, ovens, and vacuum pumps. As an ISO-certified manufacturer with over 20 years of expertise, our supplies are designed to meet the highest international standards required by research facilities and laboratories.  

Reach out to our team at Across International Australia to discover how we can assist you in achieving greater efficiency and control in your laboratory and industrial workflows. 


Water Purity FAQ

Frequently Asked Questions

What is the difference between distilled water and ultrapure water?

Distilled water is produced through simple distillation, a process that involves boiling and then condensing the water. Ultrapure water, in contrast, undergoes multiple advanced treatment processes to achieve a far higher level of purity, and it is not intended for drinking.

How is Type I water made?

It’s made through a multi-stage purification process that typically involves advanced purification technologies such as reverse osmosis, deionisation, and other filtration methods.

Is ultrapure water sterile?

No. Although it’s pure and free from many contaminants, further treatment processes are required to make it sterile, such as autoclaving.

Should I store Type I water?

Whenever possible, storage should be avoided, but if necessary, a recirculating, UV-treated, airtight reservoir can be used only for short periods.

When is Type II water preferred over Type I?

Type II water is generally preferred for less critical laboratory applications that don’t require high-purity water, such as instrument feeding and microbiological applications.

What is Type III water mainly used for?

With the lowest purity grade of laboratory water, Type III is typically used for tasks like glassware rinsing or heating baths.

Do laboratories commonly use Type IV pure water?

No, it is not used for critical tasks as it has the lowest purity grade (ASTM classification). But it is commonly used as feed to produce pure water.
