Methods to Enhance BERT in Aspect-Based Sentiment Classification

Yufeng Zhao, Evelyn Soerjodjojo, Haiying Che*

*Corresponding author of this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

BERT is a widely used pre-trained model in Natural Language Processing tasks, including Aspect-Based Sentiment Classification. Its enormous number of pre-trained parameters encodes rich prior language knowledge, which makes fine-tuning BERT a critical issue. Previous work has mainly relied on specialized downstream networks or additional knowledge to adapt BERT to sentiment classification tasks. In this paper, we design experiments to identify fine-tuning techniques that can be used by any model built on BERT for Aspect-Based Sentiment Classification. Through these experiments, we evaluate different feature extraction, regularization, and continual learning methods, and summarize 8 universally applicable conclusions that enhance the training and performance of the BERT model.
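One regularization technique commonly studied in this line of BERT fine-tuning work is layer-wise learning rate decay, where lower (more general) Transformer layers receive smaller learning rates than upper (more task-specific) layers. The sketch below is illustrative only and is not taken from the paper; the function name, the 12-layer BERT-base assumption, and the default decay factor are all hypothetical choices for demonstration.

```python
def layerwise_learning_rates(base_lr, num_layers, decay):
    """Assign a per-layer learning rate that decays geometrically
    from the top Transformer layer down to the embedding layer.

    base_lr    -- learning rate for the topmost layer (index num_layers)
    num_layers -- number of Transformer layers (12 for BERT-base)
    decay      -- multiplicative decay per layer, e.g. 0.95

    Returns a dict mapping layer index (0 = embeddings/bottom) to its rate.
    """
    return {
        layer: base_lr * decay ** (num_layers - layer)
        for layer in range(num_layers + 1)
    }

# Example: BERT-base (12 layers), top-layer LR 2e-5, decay 0.95.
# The resulting dict would typically be used to build per-parameter
# groups for an optimizer such as AdamW.
lrs = layerwise_learning_rates(2e-5, 12, 0.95)
```

In a real training loop these rates would be attached to optimizer parameter groups, so that embeddings change slowly while the classification head adapts quickly.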

Original language: English
Host publication title: Proceedings - 2022 Euro-Asia Conference on Frontiers of Computer Science and Information Technology, FCSIT 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 21-27
Number of pages: 7
ISBN (electronic): 9781665463539
DOI
Publication status: Published - 2022
Event: 2022 Euro-Asia Conference on Frontiers of Computer Science and Information Technology, FCSIT 2022 - Beijing, China
Duration: 16 Dec 2022 - 18 Dec 2022

Publication series

Name: Proceedings - 2022 Euro-Asia Conference on Frontiers of Computer Science and Information Technology, FCSIT 2022

Conference

Conference: 2022 Euro-Asia Conference on Frontiers of Computer Science and Information Technology, FCSIT 2022
Country/Territory: China
City: Beijing
Period: 16/12/22 - 18/12/22
