D4SC: Deep Supervised Semantic Segmentation for Seabed Characterization and Uncertainty Estimation for Large Scale Mapping

Research output: Contribution to journal › Article › peer-review

Abstract

Seabed characterization is the study of the physical and biological properties of the ocean floor. Sonar is commonly employed to capture the acoustic backscatter reflected from the seabed, and it has been used extensively for automatic target recognition (ATR) in mine countermeasures (MCM) operations in shallow waters. However, conventional machine learning (ML) and deep learning approaches struggle to map the seabed automatically because of noise and limited labels. This article therefore introduces the Deep Supervised Semantic Segmentation model for Seabed Characterization (D4SC), tailored to the challenges of sonar data. D4SC employs convolutional neural networks, preprocessing and data augmentation methods specific to high-resolution (HR) synthetic aperture sonar (SAS) data, including the novel boundary pixel label rejection, and moves beyond the low-label regime. Performance comparisons against standard methods in the literature demonstrate D4SC's superiority on challenging HR SAS survey datasets from real-world MCM exercises at sea. In addition, this work thoroughly explores the effect of dataset quality, the robustness of the trained models on out-of-distribution data, and the estimation of epistemic uncertainty to refine predictions at large scale.
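The abstract names two mechanisms concrete enough to sketch. First, boundary pixel label rejection: the paper's implementation is not reproduced here, but a plausible minimal reading is to exclude pixels lying within a small radius of a class boundary from the training loss, on the assumption that hand-annotated seabed-type boundaries are the least reliable labels. The function name, the radius parameter, and the ignore-index convention below are illustrative assumptions, not the authors' code.

    import numpy as np
    from scipy import ndimage

    IGNORE_INDEX = 255  # label value the segmentation loss is told to skip

    def reject_boundary_labels(labels: np.ndarray, radius: int = 2) -> np.ndarray:
        """Copy an (H, W) integer class map, setting every pixel within
        `radius` pixels of a class boundary to IGNORE_INDEX."""
        boundary = np.zeros(labels.shape, dtype=bool)
        # A pixel sits on a boundary if any 4-neighbour holds another class.
        boundary[:-1, :] |= labels[:-1, :] != labels[1:, :]
        boundary[1:, :] |= labels[1:, :] != labels[:-1, :]
        boundary[:, :-1] |= labels[:, :-1] != labels[:, 1:]
        boundary[:, 1:] |= labels[:, 1:] != labels[:, :-1]
        # Grow the one-pixel boundary into a band of the requested radius.
        band = ndimage.binary_dilation(boundary, iterations=radius)
        out = labels.copy()
        out[band] = IGNORE_INDEX
        return out

The rejected map then plugs into any loss that supports label masking, e.g. torch.nn.CrossEntropyLoss(ignore_index=255). Second, epistemic uncertainty: the abstract does not say which estimator is used, so the sketch below shows one common choice, Monte Carlo dropout with the mutual-information (BALD) decomposition that separates epistemic from aleatoric uncertainty; the paper's actual estimator may differ.

    import torch
    import torch.nn as nn

    @torch.no_grad()
    def mc_dropout_epistemic(model, x, n_samples: int = 20):
        """Mean per-pixel prediction and epistemic uncertainty (mutual
        information) from `n_samples` stochastic forward passes."""
        model.eval()
        for m in model.modules():  # re-enable only the dropout layers
            if isinstance(m, (nn.Dropout, nn.Dropout2d)):
                m.train()
        samples = torch.stack(
            [torch.softmax(model(x), dim=1) for _ in range(n_samples)]
        )                                 # (T, B, C, H, W)
        mean_p = samples.mean(dim=0)      # (B, C, H, W)
        eps = 1e-12
        total = -(mean_p * (mean_p + eps).log()).sum(dim=1)                # predictive entropy
        aleatoric = -(samples * (samples + eps).log()).sum(dim=2).mean(dim=0)
        return mean_p.argmax(dim=1), total - aleatoric                     # epistemic share

Pixels whose epistemic score exceeds a chosen threshold can be withheld or flagged when per-tile predictions are mosaicked into a large-scale seabed map, which is the kind of prediction refinement the abstract alludes to.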

Original language: English
Pages (from-to): 18038-18057
Number of pages: 20
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Volume: 17
DOIs
Publication status: Published - 2024

Keywords

  • Deep learning (DL)
  • image segmentation
  • synthetic aperture sonar (SAS)
  • uncertainty
