Attention-based neural network for end-to-end music separation

Jing Wang*, Hanyue Liu, Haorong Ying, Chuhan Qiu, Jingxin Li, Muhammad Shahid Anwar*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

End-to-end separation algorithms, despite their superior performance in speech separation, have not been used effectively for music separation. Moreover, because music signals are typically dual-channel data with a high sampling rate, modelling long-sequence data and making rational use of the correlated information between channels remain open problems. To address these problems, we enhance the performance of an end-to-end music separation algorithm by improving its network structure. Our main contributions are as follows: (1) We design a more reasonable densely connected U-Net to capture the long-term characteristics of music, such as the main melody and tone. (2) On this basis, we introduce multi-head attention and a dual-path transformer into the separation module. Channel attention units are applied recursively to the feature map of each network layer, enabling the network to perform long-sequence separation. Experimental results show that introducing channel attention yields a stable improvement over the baseline system. On the MUSDB18 dataset, the average score of the separated audio exceeds that of the current best-performing music separation algorithm based on the time-frequency domain (T-F domain).
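To make the recursive channel-attention idea concrete, the sketch below shows a minimal channel-attention unit in PyTorch, assuming a squeeze-and-excitation-style gate (global pooling followed by a bottleneck MLP and a sigmoid) applied to the feature map of one network layer. The class name, reduction ratio, and tensor shapes are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch of a channel-attention unit; the design
    # (squeeze-and-excitation-style gating) is an assumption, not the
    # paper's exact module.
    import torch
    import torch.nn as nn

    class ChannelAttention(nn.Module):
        """Reweights feature-map channels using globally pooled statistics."""
        def __init__(self, channels: int, reduction: int = 8):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool1d(1)   # squeeze: average over time
            self.fc = nn.Sequential(              # excitation: per-channel gate
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, time) feature map from one U-Net layer
            b, c, _ = x.shape
            w = self.fc(self.pool(x).view(b, c)).view(b, c, 1)
            return x * w                           # rescale each channel

    if __name__ == "__main__":
        feat = torch.randn(2, 64, 16000)           # dummy layer features
        att = ChannelAttention(64)
        print(att(feat).shape)                     # torch.Size([2, 64, 16000])

Applying such a unit recursively at every layer lets the network emphasise channels that carry source-relevant long-term structure while suppressing the rest, which is consistent with the stable improvement over the baseline reported above.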

Original language: English
Pages (from-to): 355-363
Number of pages: 9
Journal: CAAI Transactions on Intelligence Technology
Volume: 8
Issue number: 2
Publication status: Published - Jun 2023

Keywords

  • channel attention
  • densely connected network
  • end-to-end music separation
