Accepted Paper: An Attentive Memory Network Integrated with Aspect Dependency for Document-Level Multi-Aspect Sentiment Classification



Qingxuan Zhang (Beijing Institute of Technology); Chongyang Shi (Beijing Institute of Technology)


Document-level multi-aspect sentiment classification is one of the foundational tasks in natural language processing (NLP), and neural network methods have achieved great success in review sentiment classification. Most recent works ignore the relations between different aspects and do not take into account the context-dependent importance of sentences and aspect keywords. In this paper, we propose an attentive memory network for document-level multi-aspect sentiment classification. Unlike recently proposed models, which average the word embeddings of aspect keywords to represent aspects and use hierarchical architectures to encode review documents, we adopt attention-based memory networks to construct aspect and sentence memories. A recurrent attention operation is employed to capture long-distance dependencies across sentences and to obtain aspect-aware document representations over the aspect and sentence memories. Then, information from neighboring aspects is incorporated into the final aspect rating predictions using multi-hop attention memory networks. Experimental results on two real-world datasets, TripAdvisor and BeerAdvocate, show that our model achieves state-of-the-art performance.
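The multi-hop attention read over a memory that the abstract refers to can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the dot-product scoring, the residual query update, and the number of hops are all assumptions made for the example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(memory, query, hops=3):
    """Perform one attention read per hop, updating the query each time.

    memory: (n_slots, d) array of memory slots (e.g. sentence or aspect vectors)
    query:  (d,) initial query vector (e.g. an aspect representation)
    Returns the refined (d,) query after `hops` attention reads.
    """
    q = query
    for _ in range(hops):
        scores = memory @ q        # dot-product attention score per memory slot
        weights = softmax(scores)  # attention distribution over the slots
        read = weights @ memory    # weighted sum of memory slots
        q = q + read               # residual update carries the query to the next hop
    return q

rng = np.random.default_rng(0)
mem = rng.normal(size=(5, 8))   # 5 memory slots of dimension 8
q0 = rng.normal(size=8)
out = multi_hop_attention(mem, q0)
print(out.shape)  # (8,)
```

In this style of model, each hop lets the query attend to a different mixture of memory slots, which is one way to aggregate information from neighboring aspects or distant sentences before the final rating prediction.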