Improving Data Quality of Low-Cost Light-Scattering PM Sensors: Toward Automatic Air Quality Monitoring in Urban Environments / Ramirez-Espinosa, G.; Chiavassa, P.; Giusto, E.; Quer, S.; Montrucchio, B.; Rebaudengo, M.. - In: IEEE INTERNET OF THINGS JOURNAL. - ISSN 2327-4662. - 11:17(2024), pp. 28409-28420. [10.1109/JIOT.2024.3405623]
Improving Data Quality of Low-Cost Light-Scattering PM Sensors: Toward Automatic Air Quality Monitoring in Urban Environments
Giusto E.;
2024
Abstract
Low-cost light-scattering particulate matter (PM) sensors are often advocated for dense monitoring networks, and recent literature has focused on evaluating their performance. Nonetheless, low-cost sensors are also considered unreliable and imprecise. Consequently, techniques for anomaly detection, resilient calibration, and data quality improvement deserve further investigation. In this study, we analyse a year-long acquisition campaign in which 56 low-cost light-scattering sensors were positioned near the inlet of an official PM monitoring station. We use the collected measurements to design and test a data processing pipeline composed of several stages, including fault detection, filtering, outlier removal, and calibration. These stages can be applied in large-scale deployment scenarios where the volume of sensor data is too high to be analysed manually. Our framework also exploits sensor redundancy to improve reliability and accuracy. Our results show that the proposed data processing framework produces more reliable measurements, reduces errors, and increases the correlation with the official reference.
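The abstract names four pipeline stages (fault detection, filtering, outlier removal, calibration) applied to redundant co-located sensors. As a rough illustration only, the Python sketch below shows one plausible way such a redundancy-aware pipeline could be wired together; the stuck-value fault check, rolling-median filter, cross-sensor MAD rule, linear calibration, and all thresholds are assumptions for illustration and do not reproduce the authors' method.

```python
# Illustrative sketch of a multi-stage pipeline for redundant low-cost PM sensors.
# All stage choices and thresholds are hypothetical, not taken from the paper.
import numpy as np
import pandas as pd

def drop_faulty(readings: pd.DataFrame, min_std: float = 0.1) -> pd.DataFrame:
    """Drop sensors whose signal barely varies (a simple stuck-value fault check)."""
    return readings.loc[:, readings.std() > min_std]

def smooth(readings: pd.DataFrame, window: int = 5) -> pd.DataFrame:
    """Rolling-median filter to suppress short spikes in each sensor's series."""
    return readings.rolling(window, min_periods=1).median()

def remove_outliers(readings: pd.DataFrame, k: float = 3.0) -> pd.DataFrame:
    """Mask samples that deviate strongly from the cross-sensor median (MAD rule)."""
    med = readings.median(axis=1)
    mad = readings.sub(med, axis=0).abs().median(axis=1) + 1e-9
    mask = readings.sub(med, axis=0).abs().div(mad, axis=0) > k
    return readings.mask(mask)

def calibrate(consensus: pd.Series, reference: pd.Series) -> pd.Series:
    """Fit a linear correction against the reference station and apply it."""
    valid = consensus.notna() & reference.notna()
    a, b = np.polyfit(consensus[valid], reference[valid], deg=1)
    return a * consensus + b

# Tiny usage example with synthetic data (200 samples, 4 redundant sensors).
rng = np.random.default_rng(0)
reference = pd.Series(20 + 10 * np.sin(np.linspace(0, 6, 200)))  # stand-in for the official station
raw = pd.DataFrame({f"s{i}": 1.3 * reference + rng.normal(0, 2, 200) for i in range(4)})
clean = remove_outliers(smooth(drop_faulty(raw)))
pm = calibrate(clean.median(axis=1), reference)  # consensus of redundant sensors, then calibration
```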