TY - GEN
T1 - Aspect Term Extraction via Dynamic Attention and a Densely Connected Graph Convolutional Network
AU - Sun, Xin
AU - Mi, Yongqing
AU - Liu, Jia
AU - Li, Hongao
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2025
Y1 - 2025
N2 - Aspect term extraction is a crucial step in aspect-level sentiment analysis and significantly affects the accuracy of sentiment classification. Therefore, improving the precision of aspect term extraction is vital for enhancing the performance of sentiment analysis. Existing methods give inadequate consideration to syntactic information and inter-word dependencies and struggle to mitigate weight noise introduced during dependency tree conversion. To address these issues, we propose an aspect term extraction approach that leverages dynamic attention and a graph convolutional network. Our method uses a densely connected graph convolutional network to capture dependency information between distant terms, thereby enriching vector semantics. Furthermore, it integrates a dynamic attention mechanism informed by dependency parsing to highlight critical dependencies and mitigate noise interference. We benchmark our model against state-of-the-art approaches on four widely used public datasets. The results indicate that the proposed method significantly enhances aspect term extraction performance. Specifically, our model improves upon baseline models on the Lap14 and Rest15 datasets, with macro-F1 gains of 0.45 and 0.04, respectively.
AB - Aspect term extraction is a crucial step in aspect-level sentiment analysis and significantly affects the accuracy of sentiment classification. Therefore, improving the precision of aspect term extraction is vital for enhancing the performance of sentiment analysis. Existing methods give inadequate consideration to syntactic information and inter-word dependencies and struggle to mitigate weight noise introduced during dependency tree conversion. To address these issues, we propose an aspect term extraction approach that leverages dynamic attention and a graph convolutional network. Our method uses a densely connected graph convolutional network to capture dependency information between distant terms, thereby enriching vector semantics. Furthermore, it integrates a dynamic attention mechanism informed by dependency parsing to highlight critical dependencies and mitigate noise interference. We benchmark our model against state-of-the-art approaches on four widely used public datasets. The results indicate that the proposed method significantly enhances aspect term extraction performance. Specifically, our model improves upon baseline models on the Lap14 and Rest15 datasets, with macro-F1 gains of 0.45 and 0.04, respectively.
KW - Aspect term extraction
KW - Dynamic attention mechanism
KW - Graph convolutional network
UR - http://www.scopus.com/inward/record.url?scp=85210146879&partnerID=8YFLogxK
U2 - 10.1007/978-981-96-0116-5_32
DO - 10.1007/978-981-96-0116-5_32
M3 - Conference contribution
AN - SCOPUS:85210146879
SN - 9789819601158
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 383
EP - 395
BT - PRICAI 2024
A2 - Hadfi, Rafik
A2 - Ito, Takayuki
A2 - Anthony, Patricia
A2 - Sharma, Alok
A2 - Bai, Quan
PB - Springer Science and Business Media Deutschland GmbH
T2 - 21st Pacific Rim International Conference on Artificial Intelligence, PRICAI 2024
Y2 - 18 November 2024 through 24 November 2024
ER -