Journal of South China University of Technology (Natural Science Edition) ›› 2020, Vol. 48 ›› Issue (6): 97-105.doi: 10.12141/j.issn.1000-565X.190830

• Computer Science & Technology •

Joint Deep Recommendation Model Based on Double-Layer Attention Mechanism

LIU Huiting, JI Qiang, LIU Huimin, ZHAO Peng

  1. School of Computer Science and Technology,Anhui University,Hefei 230601,Anhui,China
  • Received:2019-11-12 Revised:2020-01-29 Online:2020-06-25 Published:2020-06-01
  • Contact: LIU Huiting, E-mail: htliu@ahu.edu.cn
  • About author: LIU Huiting (b. 1978), female, Ph.D., associate professor; her research interests include natural language processing and personalized recommendation.
  • Supported by:
    Supported by the National Natural Science Foundation of China (61202227, 61602004) and the Natural Science Research Project of Colleges and Universities in Anhui Province (KJ2018A0013)

Abstract: Many e-commerce websites accumulate large numbers of customer reviews. Most recommendation systems exploit these reviews by considering their importance only at the word level rather than at the review level. Focusing exclusively on important words while ignoring truly useful reviews reduces the effectiveness of a recommendation model. To address this, a joint deep recommendation model based on a double-layer attention mechanism (DLALSTM) was proposed. First, DLALSTM uses a bidirectional long short-term memory network (BiLSTM) to jointly model customers and reviews at both the word and review levels, and aggregates the review representations and the customer/item representations through a double-layer attention mechanism. Then, the latent representations of customers and items learned from the reviews are incorporated into the customer preferences and item features obtained from the rating matrix to make rating predictions. DLALSTM was compared with commonly used recommendation methods through experimental evaluation on datasets from different domains of Yelp and Amazon. The results show that DLALSTM outperforms state-of-the-art recommendation methods. Moreover, the proposed model can alleviate the data sparsity problem to some extent and has good interpretability.
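The double-layer attention described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: random matrices stand in for BiLSTM hidden states, the attention queries `w_word` and `w_review` are hypothetical learned parameters, and the final fusion with rating-matrix latent factors is shown in its simplest additive form.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Attention-weighted pooling: score each row of H against query w,
    normalize with softmax, and return the weighted sum plus the weights."""
    weights = softmax(H @ w)          # (n,) importance weights, sum to 1
    return weights @ H, weights       # pooled vector (d,), weights (n,)

d = 8                                 # hidden size (stand-in for BiLSTM output dim)
reviews = [rng.normal(size=(5, d)),   # 3 reviews by one customer; each matrix is
           rng.normal(size=(7, d)),   # (num_words x d) hidden states, here random,
           rng.normal(size=(4, d))]   # standing in for real BiLSTM outputs

w_word = rng.normal(size=d)           # word-level attention query (learned in practice)
w_review = rng.normal(size=d)         # review-level attention query (learned in practice)

# Layer 1: word-level attention pools each review into a single vector,
# emphasizing the important words within the review.
review_vecs = np.stack([attention_pool(H, w_word)[0] for H in reviews])

# Layer 2: review-level attention pools the review vectors into one
# text-based customer representation, weighting the useful reviews more.
user_text_rep, review_weights = attention_pool(review_vecs, w_review)

# Fuse the review-based representations with latent factors from the rating
# matrix and predict a rating via a dot product (a simplified fusion).
user_latent = rng.normal(size=d)      # customer preference from matrix factorization
item_latent = rng.normal(size=d)      # item feature from matrix factorization
item_text_rep = rng.normal(size=d)    # would come from the item's reviews, same pipeline
rating = (user_latent + user_text_rep) @ (item_latent + item_text_rep)
```

The second attention layer is what distinguishes this design from word-level-only models: `review_weights` exposes which reviews drove the prediction, which is the source of the interpretability the abstract mentions.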

Key words: attention mechanism, bidirectional long short-term memory network, recommendation system, deep learning