
Attention-driven acoustic properties learning for underwater target ranging

  • Xiaohui Chu
  • Hantao Zhou
  • Yan Zhang
  • Yachao Zhang
  • Runze Hu*
  • Haoran Duan
  • Yawen Huang
  • Yefeng Zheng
  • Rongrong Ji
  • *Corresponding author for this work
  • Beijing Institute of Technology
  • Tsinghua University
  • Xiamen University
  • Durham University
  • Tencent

Research output: Contribution to journal › Article › peer-review

Abstract

Underwater acoustic ranging (UAR) plays a crucial role in estimating object distances for ocean exploration. However, a reliable UAR method remains elusive, with current approaches either relying on inadequate hand-crafted features or neglecting the unique underwater acoustic properties. To address this, we propose Multi-attentional Underwater Acoustic Ranging (MUAR), a highly effective and robust UAR framework. MUAR incorporates multiple attention mechanisms tailored to the acoustic properties. Specifically, to better leverage the rich channel information in UAR data, we design a grouped channel attention module that can efficiently capture informative channels of the input data. Then, a feature-balancing strategy based on spatial attention is introduced to mitigate information redundancy and conflicts, thereby enhancing the multi-level expressive capability of the model. We further theoretically analyze the connection between the self-attention mechanism and acoustic signal correlations, thereby achieving a better interpretation of the extracted features. Through extensive experiments and analysis on three authentic datasets, we show that MUAR outperforms previous approaches and obtains state-of-the-art performance, i.e., achieving an MSE of 0.44 (vs. 2.72) and a MAPE of 0.97 (vs. 2.42). The source code of the proposed MUAR is released at https://github.com/TiernosChu/MUAR.
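The grouped channel attention described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration of the general idea (channels split into groups, each group squeezed by global average pooling and re-weighted by a sigmoid gate, in the style of squeeze-and-excitation attention), not the authors' released implementation; the random weights stand in for learned parameters, and the group count and bottleneck size are arbitrary assumptions.

```python
import numpy as np

def grouped_channel_attention(x, num_groups=4, seed=0):
    """Hypothetical sketch: split channels into groups, squeeze each
    group by global average pooling, and re-weight its channels with
    a sigmoid gate produced by a small bottleneck (random weights
    here stand in for trained parameters)."""
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    rng = np.random.default_rng(seed)
    gc = c // num_groups
    out = np.empty_like(x)
    for g in range(num_groups):
        xg = x[:, g * gc:(g + 1) * gc]          # (n, gc, h, w)
        squeeze = xg.mean(axis=(2, 3))          # global average pool -> (n, gc)
        # stand-in for a learned bottleneck MLP: W2 @ relu(W1 @ s)
        w1 = rng.standard_normal((gc, gc // 2))
        w2 = rng.standard_normal((gc // 2, gc))
        z = np.maximum(squeeze @ w1, 0.0) @ w2  # excitation logits, (n, gc)
        gate = 1.0 / (1.0 + np.exp(-z))         # sigmoid gate in (0, 1)
        out[:, g * gc:(g + 1) * gc] = xg * gate[:, :, None, None]
    return out

x = np.random.default_rng(1).standard_normal((2, 16, 8, 8))
y = grouped_channel_attention(x, num_groups=4)
print(y.shape)  # (2, 16, 8, 8)
```

Because the gate lies in (0, 1), the module only attenuates channels; in a trained model the gate values would reflect which channel groups carry informative acoustic content. For the actual module design, consult the source code linked in the abstract.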

Original language: English
Article number: 111560
Journal: Pattern Recognition
Volume: 164
DOI
Publication status: Published - Aug 2025
