Abstract
Underwater acoustic ranging (UAR) plays a crucial role in estimating object distances for ocean exploration. However, a reliable UAR method remains elusive, with current approaches either relying on inadequate hand-crafted features or neglecting the unique properties of underwater acoustics. To address this, we propose Multi-attentional Underwater Acoustic Ranging (MUAR), a highly effective and robust UAR framework that incorporates multiple attention mechanisms tailored to these acoustic properties. Specifically, to better leverage the rich channel information in UAR data, we design a grouped channel attention module that efficiently captures the informative channels of the input. A feature-balancing strategy based on spatial attention is then introduced to mitigate information redundancy and conflicts, thereby enhancing the multi-level expressive capability of the model. We further provide a theoretical analysis of the connection between the self-attention mechanism and acoustic signal correlations, which offers a clearer interpretation of the extracted features. Through extensive experiments and analysis on three real-world datasets, we show that MUAR outperforms previous approaches and achieves state-of-the-art performance, i.e., an MSE of 0.44 (vs. 2.72) and a MAPE of 0.97 (vs. 2.42). The source code of the proposed MUAR is released at https://github.com/TiernosChu/MUAR.
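Below is a minimal sketch of the two attention components described in the abstract, assuming a PyTorch-style implementation. The class names (`GroupedChannelAttention`, `SpatialFeatureBalancing`), the group count, and the reduction ratio are illustrative assumptions and are not taken from the released code in the repository linked above.

```python
# Illustrative sketch only: grouped channel attention and a spatial-attention
# feature-balancing gate, as described at a high level in the abstract.
# Hyperparameters and class names are assumptions, not the authors' code.
import torch
import torch.nn as nn


class GroupedChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention applied per channel group."""

    def __init__(self, channels: int, groups: int = 4, reduction: int = 8):
        super().__init__()
        assert channels % groups == 0, "channels must be divisible by groups"
        self.groups = groups
        group_ch = channels // groups
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Lightweight gating MLP shared across groups.
        self.fc = nn.Sequential(
            nn.Linear(group_ch, max(group_ch // reduction, 1)),
            nn.ReLU(inplace=True),
            nn.Linear(max(group_ch // reduction, 1), group_ch),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        g = self.groups
        xg = x.reshape(b * g, c // g, h, w)              # split channels into groups
        weights = self.fc(self.pool(xg).flatten(1))      # per-group channel weights
        xg = xg * weights.view(b * g, c // g, 1, 1)      # reweight informative channels
        return xg.reshape(b, c, h, w)


class SpatialFeatureBalancing(nn.Module):
    """Spatial-attention gate balancing two feature levels before fusion."""

    def __init__(self):
        super().__init__()
        # 2 input channels: max- and mean-pooled maps along the channel axis.
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        fused = low + high
        attn = torch.sigmoid(self.conv(torch.cat(
            [fused.max(dim=1, keepdim=True).values,
             fused.mean(dim=1, keepdim=True)], dim=1)))
        # Complementary weighting suppresses redundant or conflicting responses.
        return attn * low + (1.0 - attn) * high


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    x = GroupedChannelAttention(64)(x)
    y = SpatialFeatureBalancing()(torch.randn(2, 64, 32, 32), x)
    print(y.shape)  # torch.Size([2, 64, 32, 32])
```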
| Field | Value |
|---|---|
| Original language | English |
| Article number | 111560 |
| Journal | Pattern Recognition |
| Volume | 164 |
| DOIs | |
| Publication status | Published - Aug 2025 |
Keywords
- Attention mechanism
- Deep learning
- Multi-scale feature fusion
- Remote sensing
- Underwater acoustic ranging