Gas detectors can be difficult to maintain, especially when you need to perform frequent instrument calibration. Calibration itself is not a challenging or time-consuming process; it takes only a few minutes and a few button presses. The tough part is finding the time in your schedule when countless other tasks demand your attention. As your days get busier, it gets harder to stop and maintain something that may not be showing signs of wear and won’t help you finish your job any faster.

Instrument manufacturers typically recommend a bump test before each day’s use and monthly calibrations to ensure proper instrument performance. Unfortunately for the industry, a dangerous rumor persists that these industry-standard maintenance recommendations are unnecessary and that manufacturer recommendations are just a ploy to sell calibration gas.

The popularity of these rumors has spiked recently with the introduction of multi-gas instruments that use new low-power infrared sensor technology for combustible gas detection and claim to perform for up to two years without calibration. The low-power infrared sensors do deliver on the promise of extended instrument runtime, but these instruments still use the same electrochemical sensor technology for toxic gas detection, so why wouldn’t they need calibration?

Instrument manufacturers’ calibration recommendations are based on many factors, one of which is sensor drift: the natural tendency of a sensor’s performance to degrade over time as its components age. For electrochemical sensor technology, this is an undeniable fact.

Factors that contribute to sensor drift

In September 2013, OSHA published a Safety and Health Information Bulletin titled “Calibrating and Testing Direct-Reading Portable Gas Monitors.” In this bulletin, OSHA identified nine factors that contribute to sensor drift. Seven of these factors relate to electrochemical sensors:

  1. Degradation of phosphorus-containing components
  2. Degradation of lead-containing components
  3. Gradual chemical degradation of sensors and drift in electronic components that occur normally over time
  4. Use in extreme environmental conditions, such as high/low temperature and humidity, and high levels of airborne particulates
  5. Exposure to high concentrations of the target gases and vapors
  6. Exposure of electrochemical toxic gas sensors to solvent vapors and highly corrosive gases
  7. Handling/jostling of the equipment causing enough vibration or shock over time to affect electronic components and circuitry

The third factor is natural sensor drift, which sensor manufacturers typically specify as <2% to <5% per month. In other words, a sensor that read 100ppm immediately after calibration may read as low as 95ppm one month later, even before accounting for any environmental factors. Sensor specifications are based on laboratory testing; sensors that are constantly subjected to challenging applications and environments will perform worse than specification.

For simplicity’s sake, the following example ignores all other causes of sensor drift, including the other six factors listed above as well as temporary drift caused by sudden changes in temperature and humidity. The two graphs shown assume a 2% monthly sensor drift for carbon monoxide (CO) and hydrogen sulfide (H2S) sensors and standard calibration concentrations of 100ppm and 25ppm, respectively. The compounding effect of a 2% monthly sensor drift alone results in 38% lower readings after 24 months and 62% lower readings after 48 months. In other words, after two years, an instrument in a hazardous environment exposed to 100ppm of CO and 25ppm of H2S could display 62ppm of CO and 15.4ppm of H2S. After four years, the readings would be 38ppm of CO and 9.5ppm of H2S. Again, these graphs ignore all potential causes of instrument inaccuracy except natural sensor drift. After four years, assuming standard alarm set points, neither instrument would produce a high alarm. The H2S reading wouldn’t even trigger a low alarm.
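To make the compounding arithmetic concrete, here is a minimal Python sketch, an illustration only and not any vendor’s software, that reproduces the readings above from nothing more than a fixed 2% monthly drift:

    # Minimal illustration (assumed model): compound a fixed 2% monthly
    # sensitivity loss, ignoring every other cause of sensor drift.
    MONTHLY_DRIFT = 0.02

    def drifted_reading(true_ppm, months):
        """Displayed reading after `months` of compounded sensor drift."""
        return true_ppm * (1 - MONTHLY_DRIFT) ** months

    for months in (24, 48):
        co = drifted_reading(100, months)  # CO calibration gas: 100ppm
        h2s = drifted_reading(25, months)  # H2S calibration gas: 25ppm
        print(f"{months} months: CO {co:.1f}ppm, H2S {h2s:.1f}ppm")

    # Output:
    # 24 months: CO 61.6ppm, H2S 15.4ppm
    # 48 months: CO 37.9ppm, H2S 9.5ppm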

One common argument against routine calibration is that a periodic bump test can validate sensor performance. A bump test is designed to ensure the instrument will detect the presence of a gas, not to validate the accuracy of the measurement. A common pass criterion for a bump test is that the instrument must detect at least 50% of the calibration gas concentration applied to it. In these examples, the instruments would be required to detect 50ppm of CO and 12.5ppm of H2S. Only after roughly 34 months of gradual sensor drift would the instruments fail a bump test. Bump tests are incredibly important tools, but they should never be considered an alternative to instrument calibration.
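Under the same simplified drift model as the sketch above, that 34-month figure falls out of a one-line calculation: find where the compounded reading drops below the 50% bump test threshold.

    import math

    # Same simplified model: months of 2% compounded drift until the
    # displayed reading falls below 50% of the applied gas concentration.
    MONTHLY_DRIFT = 0.02
    PASS_FRACTION = 0.5  # common bump test pass criterion

    months = math.log(PASS_FRACTION) / math.log(1 - MONTHLY_DRIFT)
    print(f"Bump test failure after ~{months:.1f} months")  # ~34.3 months

Put another way: any loss of accuracy smaller than 50% sails through a bump test undetected, which is exactly why a bump test cannot substitute for calibration.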

For example, say a coworker borrowed your brand-new gas detector yesterday for a few hours. On his way back to the office, he accidentally dropped it in the mud, clogging the sensor openings. If you bump test the instrument before your next use, the test will catch the problem and fail, because gas cannot reach the sensors. But a bump test will not adjust measurement accuracy in any way; it only verifies that gas can reach the sensors.

One way to ensure proper instrument performance and reduce maintenance hassles is to use a docking station or calibration station. These devices automate your routine bump tests and calibrations, download datalogs, update settings and firmware, and, most importantly, free you to focus on your job.

No one will argue that frequent instrument calibration isn’t a hassle, but its importance should never be minimized. The examples above show how an uncalibrated gas detector’s readings can become wildly inaccurate over time. Gas detectors are life-saving devices. Don’t let misleading information or a few minutes of maintenance get in the way of having a life-saving device.