Keypoints Method for Recognition of Ship Wake Components in Sentinel-2 Images by Deep Learning / Del Prete, Roberto; Graziano, MARIA DANIELA; Renga, Alfredo. - In: IEEE GEOSCIENCE AND REMOTE SENSING LETTERS. - ISSN 1545-598X. - 20:(2023), pp. 1-5. [10.1109/LGRS.2023.3324303]

Keypoints Method for Recognition of Ship Wake Components in Sentinel-2 Images by Deep Learning

Roberto Del Prete;Maria Daniela Graziano;Alfredo Renga
2023

Abstract

The wakes generated by moving vessels represent relevant patterns in remotely sensed images. They are a marker of ship presence and can be processed to infer the route, speed, size, and type of ships. Automatic wake detection can be exploited by law enforcement agencies and local authorities to support a wide range of applications, including maritime traffic surveillance, border control, and protection of marine protected areas. The topic is thus attracting increasing interest from the remote sensing community. This letter contributes to this context by presenting a novel approach based on the detection of the keypoints of wake components by convolutional neural networks (CNNs) in electro-optical satellite imagery. The selected approach to deep learning (DL) relies on a transfer-learning procedure that fine-tunes ImageNet weights. This is performed through an ad hoc dataset built from Sentinel-2 multispectral images and automatic identification system (AIS) data over northern Europe. The experimental results confirm the robustness of the proposed method, which is tested against different spectral bands from visible to near-infrared and also under a domain shift to lower-resolution Landsat-9 images. Fractional errors in the positioning of the wake vertex are lower than 10%, and the achieved heading accuracy is below 10°. The proposed method is faster than traditional approaches based on the Radon transform (RT), and due to its lightweight nature, the model can be executed efficiently on edge-AI devices, enabling real-time processing onboard.
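To illustrate how detected wake keypoints can yield a heading estimate (the quantity whose accuracy the abstract reports), the sketch below derives a ship heading from two hypothetical keypoints: the wake vertex (at the ship) and a point farther back along the wake centerline. The function name, the (row, col) coordinate convention, and the assumption of a north-up image are illustrative choices, not details taken from the paper.

```python
import math

def wake_heading_deg(vertex, tail):
    """Estimate ship heading in degrees clockwise from north, given two
    wake keypoints in a north-up image. `vertex` is the wake apex at the
    ship; `tail` lies farther back along the wake centerline, so the
    ship moves from tail toward vertex. Coordinates are (row, col),
    with rows increasing downward (southward)."""
    dr = vertex[0] - tail[0]  # row displacement (positive = southward)
    dc = vertex[1] - tail[1]  # col displacement (positive = eastward)
    # atan2(east, north); north component is -dr because rows grow south
    heading = math.degrees(math.atan2(dc, -dr))
    return heading % 360.0
```

For example, a vertex directly above its tail in the image (ship moving north) gives a heading of 0°, while a vertex to the right of its tail (ship moving east) gives 90°.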
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/948605
Citations
  • PMC: ND
  • Scopus: 1
  • Web of Science: 0