Standard Guide for the Use of Various Turbidimeter Technologies for Measurement of Turbidity in Water
Publication Date: 1 May 2011
ICS Code (Optical measuring instruments): 17.180.30
This guide covers best practices for the use of various turbidimeter designs for the measurement of turbidity in water, including drinking water, wastewater, industrial waters, and waters monitored for regulatory and environmental purposes. This guide covers both continuous and static measurements.
In principle there are three basic applications for on-line measurement setups. The first is the bypass or slipstream technique: a portion of sample is transported from the process or sample stream to the turbidimeter for analysis, and is then either returned to the sample stream or sent to waste. The second is the in-line measurement, in which the sensor is submerged directly in the sample or process stream, typically contained in a pipe. The third is the in-situ measurement, in which the sensor is inserted directly into the sample stream. The in-situ principle is intended for monitoring water during any step within a processing train, including immediately before or after the process itself.
Static measurement covers both benchtop and portable designs, in which a water sample is captured in a cell and then measured.
Depending on the monitoring goals and desired data requirements, certain technologies will deliver more desirable results for a given application. This guide will help the user align a technology to a given application with respect to best practices for data collection.
Some designs are applicable to either a lower or an upper measurement range. This guide provides guidance on the technologies best suited to a given range of turbidity.
Modern electronic turbidimeters comprise many parts that can cause them to produce different results on samples. The wavelength of the incident light, the detector type, the detector angle, the number of detectors (and their angles), and the optical pathlength are all design criteria that may differ among instruments. When these sensors are all calibrated with the same turbidity standards, they will all read the standards the same. However, samples consist of completely different matrices and may measure quite differently among these different technologies.
This guide does not provide calibration information but rather refers the user to the appropriate ASTM turbidity method and its calibration protocols. When an instrument is calibrated on traceable primary turbidity standards, the assigned turbidity units, such as those used in Table 1, are equivalent. For example, a 1 NTU formazin standard is also equivalent in measurement magnitude to a 1 FNU, a 1 FAU, and a 1 BU standard, and so forth.
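The unit-equivalence statement above can be sketched in code. The following is a minimal illustration, not part of the standard: the mapping of detection geometries to reporting units and the function name are assumptions introduced here for illustration only, though the unit labels (NTU, FNU, FAU, BU) and the numerical equivalence on a formazin standard follow the text.

```python
# Illustrative sketch (not part of the standard): an instrument calibrated
# against the same traceable formazin primary standard assigns the same
# numerical value regardless of which reporting unit its design uses.

CALIBRATION_VALUE = 1.0  # value assigned to the formazin standard

# Hypothetical mapping of instrument detection geometry to the reporting
# unit conventionally associated with it.
REPORTING_UNITS = {
    "nephelometric, white light": "NTU",
    "nephelometric, near-IR": "FNU",
    "attenuation": "FAU",
    "backscatter": "BU",
}

def assigned_value(design: str) -> tuple:
    """Return the (value, unit) an instrument of the given design
    assigns to the same formazin calibration standard."""
    return CALIBRATION_VALUE, REPORTING_UNITS[design]

# All designs read the same standard at the same magnitude;
# only the unit label differs.
magnitudes = {assigned_value(d)[0] for d in REPORTING_UNITS}
assert magnitudes == {1.0}
```

The point of the sketch is that equivalence holds only at calibration on the primary standard; as the text notes, real samples with differing matrices may read quite differently across these designs.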
Improved traceability beyond the scope of this guide may be practiced and would include listing the make and model number of the instrument used to determine the turbidity values.
This guide does not purport to cover all available technologies for high-level turbidity measurement.
The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
Refer to the MSDSs for all chemicals used in this procedure.