Closed loop integration of air-to-air visual measurements for cooperative UAV navigation in GNSS challenging environments / Causa, Flavia; Opromolla, Roberto; Fasano, Giancarmine. - In: AEROSPACE SCIENCE AND TECHNOLOGY. - ISSN 1270-9638. - 130:(2022), pp. 1-21. [10.1016/j.ast.2022.107947]
Closed loop integration of air-to-air visual measurements for cooperative UAV navigation in GNSS challenging environments
Flavia Causa; Roberto Opromolla; Giancarmine Fasano
2022
Abstract
This paper discusses the integration of air-to-air visual measurements within a cooperative multi-vehicle architecture conceived to improve the navigation performance of small Unmanned Aerial Vehicles in GNSS-challenging environments. The key concept is to exploit cooperation with other aircraft flying under better GNSS coverage conditions by exchanging navigation data and using a monocular camera system for relative sensing purposes. Specifically, accurate line-of-sight measurements, obtained using Deep Learning-based detectors and local image analysis, are complemented with distance information obtained through an innovative shape-based passive ranging approach accounting for both the target attitude and its position in the field of view. Camera-based measurements are combined with the estimates generated by the other onboard sensors within a customized Extended Kalman Filter in a closed-loop fashion, since navigation estimates are used in feedback as hints for visual processing. An experimental flight test campaign is carried out using two quadcopters. Filter performance is compared as a function of the specific set of available information sources, i.e., bearing-only vs. line of sight and ranging. Results show that the achieved trade-off among correct, false, and missed detections, together with the passive ranging accuracy, allows the filter to ensure metric-level positioning errors within GNSS-challenging areas. The added value of using both bearing and range measurements strongly depends on the formation geometry and the GNSS coverage conditions and can be predicted thanks to the "generalized dilution of precision".
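To make the fusion scheme described in the abstract more concrete, the following is a minimal sketch of an Extended Kalman Filter measurement update that fuses a line-of-sight (azimuth, elevation) and range observation of a cooperative vehicle. It is an illustration under simplifying assumptions, not the paper's implementation: the state is reduced to the own-ship position in the navigation frame (the actual filter carries a richer state), camera mounting and attitude rotations are neglected, and the function name and interfaces are hypothetical.

```python
import numpy as np

def ekf_los_range_update(x, P, z, R, p_coop):
    """Illustrative EKF update fusing air-to-air LOS (azimuth, elevation) and range.

    x      : (3,) own NED position estimate (simplified state; the paper's filter
             also estimates velocity, attitude, and sensor errors)
    P      : (3, 3) state covariance
    z      : (3,) measured [azimuth, elevation, range] of the cooperative UAV
    R      : (3, 3) measurement noise covariance
    p_coop : (3,) cooperative vehicle NED position received over the data link
    """
    # Relative position of the cooperative vehicle in the navigation frame.
    # Camera mounting and own-attitude rotations are neglected in this sketch.
    r = p_coop - x
    rn, re, rd = r
    rho_h = np.hypot(rn, re)          # horizontal distance
    rho = np.linalg.norm(r)           # slant range

    # Predicted measurement: azimuth, elevation, range
    h = np.array([np.arctan2(re, rn),
                  np.arctan2(-rd, rho_h),
                  rho])

    # Jacobian of h with respect to the relative vector r ...
    dh_dr = np.array([
        [-re / rho_h**2,              rn / rho_h**2,              0.0],
        [rn * rd / (rho**2 * rho_h),  re * rd / (rho**2 * rho_h), -rho_h / rho**2],
        [rn / rho,                    re / rho,                   rd / rho]])
    # ... and with respect to the own position (r = p_coop - x, so dr/dx = -I)
    H = -dh_dr

    # Innovation, wrapping the angular components to [-pi, pi]
    nu = z - h
    nu[0:2] = (nu[0:2] + np.pi) % (2.0 * np.pi) - np.pi

    # Standard EKF update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ nu
    P_new = (np.eye(3) - K @ H) @ P

    # Closed-loop feedback (as described in the abstract): the updated estimate
    # can be re-projected to predict where the cooperative UAV should appear in
    # the next image, serving as a hint for the Deep Learning-based detector.
    return x_new, P_new
```

Dropping the third row of `h`, `z`, and `H` (and the corresponding row/column of `R`) yields the bearing-only variant compared in the paper; the relative benefit of also using the range row depends on the formation geometry and GNSS coverage, which is what the "generalized dilution of precision" is used to predict.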