Onfire 2023 Contest: what did we learn about real time fire detection from cameras? / Gragnaniello, Diego; Greco, Antonio; Sansone, Carlo; Vento, Bruno. - In: JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING. - ISSN 1868-5137. - 16:1(2025), pp. 253-264. [10.1007/s12652-024-04939-z]
Onfire 2023 Contest: what did we learn about real time fire detection from cameras?
Gragnaniello, Diego; Greco, Antonio; Sansone, Carlo; Vento, Bruno
2025
Abstract
Several methods for fire detection from camera streams have been proposed in recent years. While traditional techniques often emphasize recall, they frequently neglect critical factors such as minimizing false positives, ensuring timely alarm notifications, and optimizing performance for devices with limited computational resources. The ONFIRE 2023 contest evaluates various approaches for detecting fire using smart cameras and establishes new evaluation metrics to measure precision, recall, notification promptness, processing speed, and resource utilization. The eight participating teams received a training set that integrated all publicly available video datasets and were evaluated on a private test set. The latter includes positive samples where fire is not present at the beginning of the clip, as well as negative samples featuring moving fire-like objects. In this paper, we provide an overview of the competition's dataset and review the proposed solutions, highlighting the winning approach, the limitations of existing datasets, and the evaluation metrics used. By analyzing the results of the competition, we propose possible design choices and future directions that may help to reduce the false positive rate while preserving accuracy.


