Accepted Paper: A Model of Text-Enhanced Knowledge Graph Representation Learning with Mutual Attention


Authors

Yashen Wang (China Academy of Electronics and Information Technology of CETC)

Abstract

This paper proposes an accurate text-enhanced knowledge graph (KG) representation model that utilizes textual information to enhance knowledge representations. In particular, a mutual attention mechanism between the KG and text is proposed to learn more accurate textual representations, which further improve the knowledge graph representation within a unified parameter-sharing semantic space. Unlike conventional joint models, no complicated linguistic analysis or strict alignment between the KG and text is required to train the model. Experimental results show that the proposed model achieves state-of-the-art performance on both link prediction and triple classification tasks, and significantly outperforms previous text-enhanced knowledge representation models.
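To make the idea of mutual attention between KG and text more concrete, the sketch below shows one way such a mechanism could be realized: a structure-based embedding attends over the word embeddings of an entity's textual description, and the pooled text vector in turn gates the structural embedding, with both views living in a shared vector space. This is a minimal illustration only; the function name `text_enhanced_embedding`, the sigmoid gate, and the interpolation weight `alpha` are assumptions for exposition, not the paper's actual formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def text_enhanced_embedding(struct_emb, word_embs, alpha=0.5):
    """Fuse a structure-based KG embedding with an attention-pooled text
    embedding of the entity/relation description (hypothetical sketch).

    struct_emb : (d,)   structure-based embedding (e.g. TransE-style)
    word_embs  : (n, d) embeddings of the n description words, same space
    alpha      : interpolation weight between structure and text views
    """
    # KG -> text attention: score each description word by its
    # compatibility with the structural embedding, then pool.
    scores = softmax(word_embs @ struct_emb)        # (n,)
    text_emb = scores @ word_embs                   # (d,)

    # Text -> KG attention: gate each structural dimension by how
    # strongly the pooled textual vector supports it.
    gate = 1.0 / (1.0 + np.exp(-(struct_emb * text_emb)))   # (d,) sigmoid gate
    struct_gated = gate * struct_emb

    # Fused representation, usable for scoring triples (h, r, t).
    return alpha * struct_gated + (1.0 - alpha) * text_emb

# Toy usage with random vectors in a shared 8-dimensional space.
rng = np.random.default_rng(0)
entity_struct = rng.normal(size=8)
description_words = rng.normal(size=(5, 8))
fused = text_enhanced_embedding(entity_struct, description_words)
print(fused.shape)   # (8,)
```

Because both attention directions operate in the same embedding space, no explicit alignment between KG elements and text spans is needed in this sketch, mirroring the abstract's claim that strict alignments are not required.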