Opinion Mining on Product Review Based on PM-LDA

Zhenni Wu, Jianyun Shang, Huaping Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper presents an updated LDA-based framework for extracting features from Chinese online user reviews. The model, named PM-LDA, extends LDA by incorporating multi-grams and part-of-speech information. Features are generated as topics, and each topic is labeled with the sentence that has the highest topic probability. Topics in PM-LDA fall into two types: general features such as product brand, color, or place of production, and latent characteristics that customers may find more interesting. Part-of-speech information is used to separate feature objects from feature opinions, improving accuracy. Several experiments evaluate the model and show improved performance in both accuracy and the quality of the topic model itself.
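The pipeline the abstract describes, multi-gram token generation, part-of-speech separation of feature objects from feature opinions, and LDA topic inference, can be sketched roughly as follows. This is a toy illustration, not the authors' implementation: the tiny English reviews and the hand-built POS lexicon are invented stand-ins (PM-LDA targets Chinese text and would use a real Chinese tagger), and a plain collapsed-Gibbs LDA stands in for the full PM-LDA model.

```python
import random

# Toy English stand-ins for Chinese review sentences (invented examples).
REVIEWS = [
    "battery life great",
    "battery life short",
    "screen color bright",
    "screen color dull",
]

# Hypothetical POS lexicon: 'n' = noun (feature object), 'a' = adjective
# (feature opinion). A real system would use a Chinese POS tagger instead.
POS = {"battery": "n", "life": "n", "screen": "n", "color": "n",
       "great": "a", "short": "a", "bright": "a", "dull": "a"}

def multigrams(tokens, n=2):
    """Unigrams plus adjacent n-grams (the 'multi-gram' idea)."""
    grams = list(tokens)
    for i in range(len(tokens) - n + 1):
        grams.append("_".join(tokens[i:i + n]))
    return grams

def split_by_pos(tokens):
    """Separate feature objects (nouns) from feature opinions (adjectives)."""
    objects = [t for t in tokens if POS.get(t) == "n"]
    opinions = [t for t in tokens if POS.get(t) == "a"]
    return objects, opinions

def lda_gibbs(docs, n_topics=2, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for plain LDA (not full PM-LDA)."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    widx = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * n_topics for _ in docs]       # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                        # topic totals
    z = []                                     # topic assignment per token
    for d, doc in enumerate(docs):
        zs = []
        for w in doc:
            k = rng.randrange(n_topics)
            zs.append(k)
            ndk[d][k] += 1; nkw[k][widx[w]] += 1; nk[k] += 1
        z.append(zs)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the token, then resample its topic from the
                # collapsed conditional p(z=t) ∝ (n_dt+α)(n_tw+β)/(n_t+Vβ).
                ndk[d][k] -= 1; nkw[k][widx[w]] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][widx[w]] + beta)
                           / (nk[t] + V * beta) for t in range(n_topics)]
                r = rng.random() * sum(weights)
                for t, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][widx[w]] += 1; nk[k] += 1
    return vocab, nkw, ndk

docs = [multigrams(r.split()) for r in REVIEWS]
vocab, topic_word, doc_topic = lda_gibbs(docs)
```

After sampling, the highest-count words of each row in `topic_word` serve as that topic's feature terms, and `split_by_pos` separates the object words from the opinion words within a topic.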

Original language: English
Title of host publication: 2018 1st Asian Conference on Affective Computing and Intelligent Interaction, ACII Asia 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538653111
Publication status: Published - 21 Sept 2018
Event: 1st Asian Conference on Affective Computing and Intelligent Interaction, ACII Asia 2018 - Beijing, China
Duration: 20 May 2018 - 22 May 2018

Publication series

Name: 2018 1st Asian Conference on Affective Computing and Intelligent Interaction, ACII Asia 2018

Conference

Conference: 1st Asian Conference on Affective Computing and Intelligent Interaction, ACII Asia 2018
Country/Territory: China
City: Beijing
Period: 20/05/18 - 22/05/18

Keywords

  • Feature extraction
  • LDA
  • Opinion mining
  • Product review
