An Attentive Memory Network Integrated with Aspect Dependency for Document-Level Multi-Aspect Sentiment Classification

Qingxuan Zhang, Chongyang Shi*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

4 Citations (Scopus)

Abstract

Document-level multi-aspect sentiment classification is one of the foundational tasks in natural language processing (NLP), and neural network methods have achieved great success in review sentiment classification. Most recent works ignore the relations between different aspects and do not take into account the context-dependent importance of sentences and aspect keywords. In this paper, we propose an attentive memory network for document-level multi-aspect sentiment classification. Unlike recently proposed models, which average the word embeddings of aspect keywords to represent an aspect and use hierarchical architectures to encode review documents, we adopt attention-based memory networks to construct aspect and sentence memories. A recurrent attention operation is employed to capture long-distance dependencies across sentences and to obtain aspect-aware document representations over the aspect and sentence memories. Information from related neighboring aspects is then incorporated into the final aspect rating predictions using multi-hop attention memory networks. Experimental results on two real-world datasets, TripAdvisor and BeerAdvocate, show that our model achieves state-of-the-art performance.
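For readers unfamiliar with the multi-hop attention mechanism the abstract refers to, the sketch below illustrates the general idea: a query vector repeatedly attends over a memory (e.g., a bank of sentence or aspect vectors) and is refined after each hop. This is a minimal illustration of the generic technique only, not the authors' implementation; the class name, the query-update rule, and all dimensions are illustrative assumptions.

```python
# Minimal sketch of multi-hop attention over a memory (hypothetical names and
# design choices; NOT the paper's actual model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHopAttentionMemory(nn.Module):
    """Refines a query by attending over memory slots for a fixed number of hops."""
    def __init__(self, hidden_dim: int, hops: int = 3):
        super().__init__()
        self.hops = hops
        # Linear update applied between hops (one common design choice).
        self.update = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, query: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # query: (batch, hidden_dim); memory: (batch, n_slots, hidden_dim)
        for _ in range(self.hops):
            # Dot-product attention scores of the query against each memory slot.
            scores = torch.bmm(memory, query.unsqueeze(-1)).squeeze(-1)  # (batch, n_slots)
            weights = F.softmax(scores, dim=-1)
            # Retrieve a context vector as the attention-weighted sum of slots.
            context = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)  # (batch, hidden_dim)
            # Fold the retrieved context back into the query for the next hop.
            query = torch.tanh(self.update(context + query))
        return query

# Usage: 2 documents, a memory of 5 sentence vectors each, hidden size 64.
memory = torch.randn(2, 5, 64)
query = torch.randn(2, 64)
doc_repr = MultiHopAttentionMemory(hidden_dim=64, hops=3)(query, memory)
print(doc_repr.shape)  # torch.Size([2, 64])
```

In the paper's setting, such hops would let the representation for one aspect draw on memories built for neighboring aspects, which is how aspect dependency enters the final rating prediction.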

Original language: English
Pages (from-to): 425-440
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 101
Publication status: Published - 2019
Event: 11th Asian Conference on Machine Learning, ACML 2019 - Nagoya, Japan
Duration: 17 Nov 2019 - 19 Nov 2019

Keywords

  • Aspect dependency
  • Memory network
  • Multi-aspect sentiment classification
