There seems to be some confusion about the calibration of gas monitors. It’s very simple, really. Gas monitors are life-saving devices. They are precision instruments that are intended solely to measure and monitor potentially lethal gases in the workplace. The only way to ensure that a gas monitor will accurately respond to the hazardous gas or gases it is designed to detect is to calibrate the sensors against a known gas standard.

Gas sensors are the heart of a gas monitoring instrument, and the foundation of gas detection. Sensor technologies most commonly used for confined space and personal monitoring include catalytic diffusion for combustible gases and electrochemical sensors for oxygen and toxic gases. Each has special characteristics and calibration requirements, but none are immune to the eventual need for verification of the sensor’s response to a known concentration of a target gas.

“Drift” away

Over time, gas sensors can fall out of tolerance and “drift” from their calibration specifications. Calculations of sensor degradation and sensitivity loss, based on ideal laboratory conditions, predict long-lasting performance: in theory, a sensor should last two, three or four years without any significant loss of sensitivity. Many sensors do, indeed, last that long. On paper, they are robust enough to withstand just about any variation in calibration procedure.

However, the issue really isn’t how good the sensors are; it is how they are used. The industrial work environment in which instruments are used is not a laboratory, and these life-saving devices face many unknowns every day that may affect the accuracy and reliability of sensor performance. A sensor may unexpectedly lose sensitivity, drift or fail to respond accurately to the target gas for a number of reasons, including sensor poisoning, leakage, overexposure, temperature and humidity extremes, or physical damage from dropping or immersion.

Electrochemical sensors, commonly used for carbon monoxide, hydrogen sulfide and oxygen monitoring, are normally stable, and degradation is slow. However, normal degradation of electrochemically based toxic and oxygen sensors is accelerated by low humidity and high temperatures due to the chemical reactions and consumption of the electrolyte. Catalytic sensors are susceptible to poisoning when exposed to substances containing silicon, halogenated hydrocarbons and high concentrations of hydrogen sulfide.

Also, any sensor may be rendered useless if hit with an extremely high concentration of the target gas. Some monitors combat this with an “over-range” feature, which cuts power to the sensor once a set limit is reached, limiting its exposure to the gas and preserving its life.
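As a rough sketch of how such a cutoff might behave (the 100 %LEL threshold and the actions shown are hypothetical; each manufacturer implements this differently in firmware):

```python
# Rough sketch of an over-range cutoff for a combustible-gas sensor.
# The threshold and actions are hypothetical, for illustration only.
OVER_RANGE_LIMIT = 100.0  # %LEL at which sensor power is cut (illustrative)

def handle_reading(pct_lel: float) -> str:
    if pct_lel >= OVER_RANGE_LIMIT:
        # Latch the over-range alarm and de-energize the sensor so the
        # sensing element is not destroyed by continued exposure.
        return "OVER-RANGE: alarm latched, sensor power cut"
    return f"reading: {pct_lel:.1f} %LEL"

print(handle_reading(35.0))   # normal reading
print(handle_reading(120.0))  # triggers the protective cutoff
```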

Contrary to what you may have been led to believe, there is no electronic method of compensation or self-calibration that will correct the effects of drops, shocks or extreme exposures to gas or temperature. These factors degrade the sensor’s ability to respond to gas with maximum accuracy.

Verifying calibration

With so many reasons why a sensor can lose sensitivity and fail to respond in a gas hazard situation, frequent confirmation of the sensor’s performance is justified. There are two methods of verifying instrument calibration:

  • A “bump” or functional test is defined as the brief exposure of the monitor to a concentration of gas(es) in excess of the lowest alarm set-point for each sensor. The instrument reading is compared to the actual concentration of gas, and if it is within an acceptable range of the actual concentration (usually within 10%), the calibration is verified.

    A bump test ensures that the sensors are working properly. When performing a bump test, the test gas concentration should be high enough to trigger the instrument alarm for each sensor. If a functional test fails, then the instrument must be adjusted through a full calibration before it is used.

  • A full calibration goes a step further than a functional test and, if performed successfully, ensures maximum accuracy of the instrument. Again, using a known concentration of test gas, the instrument reading is compared to the actual concentration of the gas, and the readings are adjusted if they do not match. Today, most direct-reading instruments offer quick, push-button calibration with electronic corrections in place of older potentiometer adjustments. If a sensor fails calibration, it should be replaced and the instrument recalibrated. (The sketch after this list shows the arithmetic behind both checks.)
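To make the pass/fail arithmetic concrete, here is a minimal sketch of both checks. The 10% tolerance comes from the text above; the simple span-factor adjustment stands in for the manufacturer-specific corrections a real instrument applies in firmware, and the example values are made up.

```python
# Minimal sketch of bump-test and span-calibration arithmetic.
BUMP_TOLERANCE = 0.10  # within 10% of the applied test-gas concentration

def bump_test_passes(reading: float, applied: float) -> bool:
    """Functional test: the reading must fall within tolerance of the
    known test-gas concentration."""
    return abs(reading - applied) <= BUMP_TOLERANCE * applied

def span_factor(reading: float, applied: float) -> float:
    """Full calibration: a factor that rescales future readings so the
    instrument matches the known test-gas concentration."""
    return applied / reading

# Example: a CO sensor reads 44 ppm against a 50 ppm test gas.
reading, applied = 44.0, 50.0
print(bump_test_passes(reading, applied))  # False: outside 10%, so calibrate
factor = span_factor(reading, applied)     # ~1.136
print(round(reading * factor, 1))          # corrected reading: 50.0
```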

Recently, OSHA published a Bulletin specifically addressing the need for regular calibration of direct-reading portable gas monitors. Although it is not a standard or regulation, the Bulletin is a clear recommendation to follow the guidelines put forth in the position statement released by the International Safety Equipment Association (ISEA) on instrument calibration for gas monitors used in confined spaces. The ISEA statement says, “A bump test or full calibration of direct-reading portable gas monitors should be made before each day’s use in accordance with manufacturer’s instructions, using an appropriate test gas.”

The right gas

The most important tool for accurate calibrations is the test gas itself. Always ensure that the gas cylinder has not reached its expiration date before calibration. The type and concentration of the gas, sample tubing, regulators and calibration adapters must be appropriate for the instrument and sensors. Combustible gas sensors are non-specific and can be calibrated to any number of different gases. Choose the calibration gas that most closely matches the gas that will be encountered.

If the gas is unknown, or if a mix of gases is suspected, then calibrating with pentane is recommended. Using pentane as a calibration standard allows the sensor to detect a larger group of hydrocarbons commonly found in confined spaces. Today, many multi-blend cylinders of gas are manufactured to simplify the task of calibrating multi-gas monitors. Match the cylinder contents to the sensors installed, and make sure the concentrations are high enough to exercise the instrument alarm set-points.
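To see why a pentane-referenced calibration broadens coverage, consider how correction factors work: a catalytic sensor calibrated on one gas can estimate the %LEL of another by scaling its reading. A minimal sketch follows; the factor values are illustrative placeholders only, and the manufacturer’s published table for the specific sensor should always be used.

```python
# Sketch of applying LEL correction factors for a catalytic sensor
# calibrated on pentane. These factors are illustrative placeholders;
# use the sensor manufacturer's published values in practice.
PENTANE_REFERENCED_FACTORS = {
    "pentane": 1.0,   # reference (calibration) gas
    "methane": 0.5,   # hypothetical: sensor over-responds to lighter gases
    "hexane": 1.1,    # hypothetical
}

def corrected_pct_lel(displayed_pct_lel: float, gas: str) -> float:
    """Estimate the %LEL of `gas` from a pentane-calibrated reading."""
    return displayed_pct_lel * PENTANE_REFERENCED_FACTORS[gas]

print(round(corrected_pct_lel(20.0, "hexane"), 1))  # 22.0 %LEL (illustrative)
```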

For verification of accuracy, calibration gas should be gravimetrically produced and traceable to the U.S. National Institute of Standards and Technology (NIST). NIST traceability comes from certified, traceable weights used in the gravimetric filling process. This means the gases are produced using a calculation of gas weight based on the molecular weight of each gas component in the mixture. The gravimetric process is the most accurate method for producing calibration gases and is not affected by temperature or pressure.
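As a simplified illustration of that weight-based calculation (the cylinder contents and quantities below are made up for the example, and real fills account for purity and other corrections):

```python
# Simplified sketch of the gravimetric fill calculation: each component's
# target weight follows from its mole fraction and molecular weight.
# The mix (50 ppm CO, balance nitrogen, 10 mol total) is illustrative.
MOLAR_MASS_G_PER_MOL = {"CO": 28.01, "N2": 28.014}

def component_masses(mole_fractions: dict, total_moles: float) -> dict:
    """Grams of each gas to weigh into the cylinder."""
    return {gas: x * total_moles * MOLAR_MASS_G_PER_MOL[gas]
            for gas, x in mole_fractions.items()}

mix = {"CO": 50e-6, "N2": 1 - 50e-6}  # 50 ppm CO, balance N2
for gas, grams in component_masses(mix, total_moles=10.0).items():
    print(f"{gas}: {grams:.4f} g")
# CO: 0.0140 g, N2: 280.1260 g
```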

Other methods, including dynamic blending and pressure blending, are dependent on temperature and pressure. Supporting documentation and certificates of analysis should be available from the calibration gas manufacturer to verify the process used and the results of the gas mixture as proof of accuracy.

Develop a schedule

Regular instrument calibration is critical to preventing inaccurate readings. There is no global standard or universal procedure to direct companies, mainly because many types of instruments are used across widely varying environments and conditions of use. OSHA’s instruction is to follow the manufacturers’ recommendations, which places responsibility on both the user and the manufacturer of the monitor.

The best way to ensure regular instrument calibration is to develop a procedure that includes a schedule for bump testing and full calibration for all gas detectors in a company’s fleet. Automated calibration stations or full-function instrument management systems can be conveniently programmed to bump test or calibrate instruments on schedule or on demand. Company procedures and best practices should be driven by the usage patterns, operating conditions and environmental conditions present.
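A minimal sketch of what such a schedule check could look like in a fleet-management script (the device names, dates and intervals are placeholders; actual intervals should come from company procedure and the manufacturer’s recommendations):

```python
from datetime import date, timedelta

# Sketch of a fleet schedule check. Records and intervals are placeholders.
BUMP_INTERVAL = timedelta(days=1)       # before each day's use (per ISEA)
FULL_CAL_INTERVAL = timedelta(days=30)  # placeholder; sidebar caps checks at 30 days

def is_due(last_done: date, interval: timedelta, today: date) -> bool:
    return today - last_done >= interval

fleet = {
    "monitor-01": {"last_bump": date(2024, 6, 1), "last_cal": date(2024, 5, 10)},
    "monitor-02": {"last_bump": date(2024, 6, 2), "last_cal": date(2024, 4, 28)},
}
today = date(2024, 6, 2)
for unit, record in fleet.items():
    if is_due(record["last_cal"], FULL_CAL_INTERVAL, today):
        print(f"{unit}: full calibration due")
    elif is_due(record["last_bump"], BUMP_INTERVAL, today):
        print(f"{unit}: bump test due")
```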

Don’t take chances. Calibrate with knowledge and confidence.

SIDEBAR: How often are full calibrations needed?

Perhaps the biggest point of contention about calibration is how often full calibrations and functional (bump) tests should be performed. ISEA recommends the following if conditions do not permit daily testing:

During a period of initial use of at least ten days in the intended atmosphere, calibration is verified daily to ensure there is nothing in the atmosphere to poison the sensor(s). The period of initial use must be of sufficient duration to ensure that the sensors are exposed to all conditions that might adversely affect the sensors.

If the tests demonstrate that no adjustments are necessary, the interval between checks may be lengthened, but it should not exceed 30 days.
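Expressed as a simple policy function (a sketch of the guidance above; the one-day, ten-day and 30-day figures come from the recommendation, while the ramp between them is hypothetical):

```python
# Sketch of the ISEA interval logic described above. The 1/10/30-day
# figures come from the recommendation; the weekly ramp is hypothetical.
INITIAL_USE_DAYS = 10
MAX_INTERVAL_DAYS = 30

def next_check_interval_days(days_in_service: int, checks_passed_in_a_row: int) -> int:
    if days_in_service < INITIAL_USE_DAYS:
        return 1  # verify daily during the initial-use period
    if checks_passed_in_a_row == 0:
        return 1  # any needed adjustment drops the schedule back to daily
    # Lengthen gradually while checks keep passing, never exceeding 30 days.
    return min(checks_passed_in_a_row * 7, MAX_INTERVAL_DAYS)

print(next_check_interval_days(5, 5))     # 1 (still in initial use)
print(next_check_interval_days(60, 2))    # 14
print(next_check_interval_days(200, 10))  # 30 (capped)
```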

ISEA recommends more frequent testing if environmental conditions that could affect instrument performance, such as sensor poisons, are suspected.