Distributed Visual Sensing and Fusion for Advanced Air Mobility

Federica Vitiello; Flavia Causa; Roberto Opromolla; Giancarmine Fasano
2024

Abstract

Surveillance solutions for Advanced and Urban Air Mobility frameworks are a key factor in enabling safe operations of highly automated aircraft in civil airspace. To design solutions suitable for all types of aircraft, non-cooperative sensors can be used, although many challenges arise from the small dimensions of the vehicles and their proximity to the ground during low-altitude missions. A distributed sensing concept can address these challenges efficiently by exploiting multiple sensors within a surveillance network. This paper proposes a strategy to fuse the information collected by three ground-fixed cameras belonging to a network of distributed sensors, and validates it through experimental flight tests. The solution feeds the standalone tracking estimates produced by each camera to a fusion center that performs triangulation and three-dimensional tracking. The approach is evaluated in a scenario involving two small UAVs flying at low altitude. The paper addresses the challenge of associating the two objects across independent, unrelated tracks to achieve robust triangulation, which yields meter-level mean errors with respect to a GNSS-based ground truth.
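
A minimal sketch of the multi-view triangulation step described in the abstract, assuming each camera contributes a calibrated 3x4 projection matrix and a 2D track position for the same object after track-to-track association. The linear least-squares (DLT-style) formulation, the function name, and the data layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def triangulate_point(proj_mats, pixels):
    """Linear (DLT-style) triangulation of one 3D point from N >= 2 views.

    proj_mats : list of 3x4 camera projection matrices (calibration assumed known)
    pixels    : list of (u, v) image coordinates of the same tracked object
    Returns the 3D point in the common reference frame.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view adds two linear constraints on the homogeneous point X:
        #   u * (P[2] @ X) - (P[0] @ X) = 0
        #   v * (P[2] @ X) - (P[1] @ X) = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # Solve A X ~ 0 in the least-squares sense via SVD: X is the right
    # singular vector associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

In the pipeline outlined in the abstract, this step would be fed by the associated 2D tracks from the three ground-fixed cameras before three-dimensional tracking; the sketch is only meant to illustrate the underlying geometry.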
2024
ISBN: 979-8-3503-4961-0; 979-8-3503-4962-7
Distributed Visual Sensing and Fusion for Advanced Air Mobility / Vitiello, Federica; Causa, Flavia; Opromolla, Roberto; Fasano, Giancarmine; Dolph, Chester; Ferrante, Todd A.; Lombaerts, Thomas; Ippolito, Corey A. - (2024), pp. 1-8. (Paper presented at the Digital Avionics Systems Conference (DASC), held in San Diego, CA, USA, 29 September - 3 October 2024) [10.1109/DASC62030.2024.10749118].
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/991305
Citations
  • Scopus: 0