Structural Bias for Aspect Sentiment Triplet Extraction

Chen Zhang, Lei Ren, Fang Ma, Jingang Wang*, Wei Wu, Dawei Song*

*Corresponding authors of this work

Research output: Contribution to journal › Conference article › Peer-reviewed

10 Citations (Scopus)

Abstract

Structural bias has recently been exploited for aspect sentiment triplet extraction (ASTE) and has led to improved performance. At the same time, it is recognized that explicitly incorporating structural bias hurts efficiency, while pretrained language models (PLMs) can already capture implicit structures. A natural question thus arises: is structural bias still a necessity in the context of PLMs? To answer this question, we propose to address the efficiency issues by integrating structural bias into the PLM through an adapter, and by replacing the syntactic dependency structure with a cheap-to-compute relative position structure. Benchmarking evaluation is conducted on the SemEval datasets. The results show that the proposed structural adapter is beneficial to PLMs and achieves state-of-the-art performance over a range of strong baselines, with a light parameter demand and low latency. Meanwhile, we raise the concern that the current evaluation default, which relies on small-scale data, yields under-confident conclusions. We therefore release a large-scale dataset for ASTE. The results on the new dataset suggest that the structural adapter remains effective and efficient at large scale. Overall, we conclude that structural bias is still a necessity even with PLMs.
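The two efficiency ideas in the abstract (an adapter module inside the PLM, and a relative position structure in place of a syntactic dependency structure) could be sketched roughly as below. This is a minimal illustrative sketch, not the paper's actual architecture: the class name, shapes, bottleneck design, and the use of a learned scalar bias per clipped token offset are all assumptions made for illustration.

```python
import numpy as np

def relative_positions(n, max_dist=4):
    """Pairwise token offsets i - j, clipped to [-max_dist, max_dist].

    A cheap stand-in for a syntactic dependency structure: the bias
    depends only on linear distance, so it needs no parser and costs
    O(n^2) integer operations. Shifted to [0, 2*max_dist] for indexing.
    """
    idx = np.arange(n)
    rel = idx[:, None] - idx[None, :]
    return np.clip(rel, -max_dist, max_dist) + max_dist

class StructuralAdapter:
    """Illustrative bottleneck adapter with a relative-position bias.

    Hypothetical design: hidden size d, bottleneck r << d, and one
    learned scalar per clipped offset added to the attention logits,
    so only the small adapter weights would need training.
    """
    def __init__(self, d=8, r=2, max_dist=4, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.standard_normal((d, r)) * 0.1
        self.w_up = rng.standard_normal((r, d)) * 0.1
        self.pos_bias = rng.standard_normal(2 * max_dist + 1) * 0.1
        self.max_dist = max_dist

    def __call__(self, h):
        n, d = h.shape
        # Structure-aware mixing: dot-product scores plus the
        # relative-position bias, softmax-normalized per row.
        scores = h @ h.T / np.sqrt(d)
        scores += self.pos_bias[relative_positions(n, self.max_dist)]
        attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
        attn /= attn.sum(axis=-1, keepdims=True)
        mixed = attn @ h
        # Residual bottleneck: down-project, ReLU, up-project.
        return h + np.maximum(mixed @ self.w_down, 0.0) @ self.w_up

adapter = StructuralAdapter(d=8, r=2)
h = np.random.default_rng(1).standard_normal((5, 8))
out = adapter(h)  # same shape as the input hidden states
```

Because the position bias is a lookup into a small table indexed by clipped linear distance, this sidesteps both the parsing latency and the heavy parameter cost that the abstract associates with explicit structural bias.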

Original language: English
Pages (from-to): 6736-6745
Number of pages: 10
Journal: Proceedings - International Conference on Computational Linguistics, COLING
Volume: 29
Issue: 1
Publication status: Published - 2022
Event: 29th International Conference on Computational Linguistics, COLING 2022 - Gyeongju, Korea (Republic of)
Duration: 12 Oct 2022 → 17 Oct 2022
