Universal Controls recently received a phone call from a client who was concerned about CO₂ sensor accuracy. In a facility with multiple sensors for both environmental and enrichment control, the sensors were all reading slightly differently. Which one is correct? The short answer: all of them. Let’s explain.
What is being measured?
Carbon dioxide detectors are manufactured, tested, and rated over a sensitivity range, and the level of sensitivity required depends on the conditions and on what is being measured. Carbon dioxide detectors measure CO₂ in “parts per million” (ppm), and each meter has a different detection range based on its intended use.
Why Do Sensors Read Differently?
A CO₂ sensor installed as part of an HVAC system will measure CO₂ levels for comfort, typically in a range from 350ppm to about 450ppm. Outside air is about 380ppm and, according to ASHRAE, the maximum indoor comfort level is about 1000ppm.
A CO₂ sensor installed as part of an enrichment system will be measuring a higher and wider range of CO₂ levels, not only for effective enrichment but also for life safety, and therefore must have a wider range of sensitivity. A CO₂ level of about 1500ppm indicates an enriched environment, and sensors used in this environment must not only detect the correct enrichment levels but also warn when levels become dangerous. The maximum safe level of CO₂ in an industrial environment is 5000ppm of exposure over an eight-hour period. Per building code requirements, the sensors, through a required alarm system, alert occupants that the environment is dangerous at levels greater than 5000ppm and warn occupants to evacuate at 10000ppm. Universal Controls specifies and installs sensors that are rated to 50000ppm.
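The alarm tiers above can be sketched as a simple threshold check. This is only an illustration of the logic described, not Universal Controls’ actual control software; the function name and tier labels are hypothetical, while the threshold values come from the text.

```python
DANGER_PPM = 5000     # dangerous above 5000ppm (eight-hour exposure limit)
EVACUATE_PPM = 10000  # evacuation warning above 10000ppm

def alarm_status(ppm: float) -> str:
    """Return the alarm tier for a CO₂ reading in ppm (illustrative)."""
    if ppm >= EVACUATE_PPM:
        return "EVACUATE"
    if ppm >= DANGER_PPM:
        return "DANGER"
    return "NORMAL"

print(alarm_status(1500))   # enriched but safe -> NORMAL
print(alarm_status(6000))   # -> DANGER
print(alarm_status(12000))  # -> EVACUATE
```

Note that an enrichment setpoint of about 1500ppm sits well below the danger tier, which is why one sensor can serve both enrichment control and life safety if its range is wide enough.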
How is accuracy defined?
Accuracy is determined by repeatedly testing the sensor against a reference gas with a known ppm value. All readings are recorded, and the range of readings defines the sensor’s accuracy. The accuracy is expressed as either a ± (plus-minus) value in ppm, as a percentage of the measured value, or a combination of both. If a sensor is repeatedly tested with a gas of a known concentration of 10000ppm and the sensor reads between 9900ppm and 10100ppm, then its accuracy is expressed as ±100ppm or ±1% (100/10,000). When sensors repeatedly perform within their pre-determined ranges, they are considered accurate.
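The calculation above is easy to reproduce. The sketch below (function name is our own, not from any sensor standard) takes repeated readings against a reference gas and reports the worst-case deviation as both ±ppm and ±percent:

```python
def accuracy_from_readings(readings, reference_ppm):
    """Express sensor accuracy as (±ppm, ±percent of reading)
    given repeated readings against a known reference gas."""
    max_dev = max(abs(r - reference_ppm) for r in readings)
    return max_dev, 100.0 * max_dev / reference_ppm

# Example from the text: readings between 9900 and 10100ppm
# against a 10000ppm reference gas.
ppm_err, pct_err = accuracy_from_readings([9900, 9980, 10050, 10100], 10000)
print(f"accuracy: ±{ppm_err:.0f}ppm or ±{pct_err:.0f}%")  # ±100ppm or ±1%
```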
What can cause inaccuracies?
Carbon dioxide sensor inaccuracy can be caused predominantly by the following: the method of sensor calibration (field calibration versus auto-calibration, more on this in a bit), frequency of calibration, quality of manufacturing and environmental factors such as exposure to water and airborne pollutants.
Field Calibration Versus Auto-Calibration
Sensors with automatic baseline calibration (ABC) capabilities store their readings over a set interval of time, then take the lowest reading during that interval and compare it to the naturally occurring CO₂ level in the atmosphere (about 400ppm). The sensor is programmed to assume that its lowest reading should be around 400ppm, so if the lowest reading in its memory is higher or lower than 400ppm, the sensor corrects its readings accordingly. Auto-calibration therefore only works in environments that regularly return to about 400ppm, roughly equal to outside air.
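The ABC behavior described above can be sketched in a few lines. This is a simplified model for illustration only (class and method names are ours, and real sensors implement ABC in firmware with vendor-specific details): keep recent raw readings, assume the lowest one corresponds to fresh air at about 400ppm, and shift future readings by the difference.

```python
from collections import deque

ABC_BASELINE_PPM = 400  # assumed fresh-air CO₂ level

class AbcSensor:
    """Toy model of automatic baseline calibration (ABC)."""

    def __init__(self, window=7 * 24):  # e.g. one week of hourly samples
        self.raw = deque(maxlen=window)  # stored readings over the interval
        self.offset = 0.0                # current baseline correction

    def read(self, raw_ppm: float) -> float:
        """Record a raw reading and return the corrected value."""
        self.raw.append(raw_ppm)
        return raw_ppm + self.offset

    def recalibrate(self):
        """Assume the lowest stored reading was fresh air (~400ppm)
        and correct the offset by the drift from that baseline."""
        self.offset += ABC_BASELINE_PPM - min(self.raw)

sensor = AbcSensor()
for ppm in (430, 650, 520):  # lowest raw reading is 430
    sensor.read(ppm)
sensor.recalibrate()          # offset becomes 400 - 430 = -30
print(sensor.read(600))       # corrected reading: 570.0
```

The failure mode in an enriched environment falls straight out of this model: if the space never drops near 400ppm, `min(self.raw)` might be, say, 1500, and every subsequent reading would be shifted down by more than 1000ppm.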
Using auto-calibrated sensors in an enriched environment is NOT recommended. The CO₂ levels will vary greatly over time, and there is no guarantee that the lowest level stored in sensor memory will be close to 400ppm, potentially leading to large inaccuracies. Universal Controls specifies and installs high-quality, field-calibrated CO₂ sensors that have been tested by our trained engineers to provide our customers with the most accurate enrichment possible, as well as completely code-compliant and safe working conditions. Universal Controls wants to help your business be productive, compliant, fun and safe!