Learning Knowledge Graph Embedding With Heterogeneous Relation Attention Networks

Abstract: Knowledge graph (KG) embedding aims to learn embedding representations that retain the inherent structure of KGs. Graph neural networks (GNNs), as an effective graph representation technique, have shown impressive performance in learning graph embeddings. However, KGs have an intrinsic property of heterogeneity: they contain various types of entities and relations. How to handle such complex graph data and aggregate multiple types of semantic information simultaneously is a critical issue. In this article, a novel heterogeneous GNN framework based on an attention mechanism is proposed. Specifically, the neighbor features of an entity are first aggregated under each relation-path. Then the importance of different relation-paths is learned from the relation features. Finally, the relation-path-based features are aggregated with the learned weights to generate the embedding representation. Thus, the proposed method not only aggregates entity features from different semantic aspects but also allocates appropriate weights to them. This method can capture various types of semantic information and selectively aggregate informative features. Experimental results on three real-world KGs demonstrate superior performance compared with several state-of-the-art methods.
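The two-level aggregation described in the abstract can be summarized in a short sketch: neighbor features are first pooled under each relation-path, an attention vector then scores each relation-path, and the per-path features are combined with softmax-normalized weights. The PyTorch sketch below is a minimal illustration of this idea, not the authors' implementation; the names (RelationPathAttention, path_feats) and the mean-pooling step in the first stage are assumptions.

```python
# Minimal sketch (not the authors' code) of the two-level aggregation:
# (1) mean-aggregate neighbor features under each relation-path,
# (2) score each relation-path with an attention vector,
# (3) combine the per-path features with softmax-normalized weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationPathAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)              # shared projection of path features
        self.attn = nn.Parameter(torch.randn(dim))   # attention vector over relation-paths

    def forward(self, path_feats):
        # path_feats: [num_paths, num_entities, dim], already aggregated per path
        h = torch.tanh(self.proj(path_feats))        # [P, N, d]
        # one importance score per relation-path, averaged over entities
        scores = (h * self.attn).sum(-1).mean(-1)    # [P]
        weights = F.softmax(scores, dim=0)           # [P]
        # weighted sum of relation-path-specific features
        return (weights.view(-1, 1, 1) * path_feats).sum(0)  # [N, d]

# toy usage: 3 relation-paths, 5 entities, 7 neighbors each, 16-dim features
neighbors_by_path = [torch.randn(5, 7, 16) for _ in range(3)]
path_feats = torch.stack([nb.mean(dim=1) for nb in neighbors_by_path])   # step (1)
entity_emb = RelationPathAttention(16)(path_feats)                       # steps (2)-(3)
print(entity_emb.shape)  # torch.Size([5, 16])
```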
EXISTING SYSTEM:
• In this paper, we provide a comprehensive survey of KG-embedding models for link prediction in knowledge graphs.
• We first provide a theoretical analysis and comparison of the existing methods proposed to date for generating KG embeddings.
• The solid lines in the left figure are existing relations, and the dotted lines are possible relations. The different colors in the right figure represent the various possible relations computed by the link-prediction task.
• We comprehensively survey existing KGE models and categorize them into three groups: translational-distance-based models, semantic-matching-based models, and neural-network-based models.
DISADVANTAGE:
• Graph-convolution-based methods address this problem by aggregating features from the neighboring entities and applying a transformation function to compute the new features.
• However, these graph-based methods give equal weight to each neighboring entity, ignoring that different neighbors have different significance when computing the new features (see the sketch after this list).
• This attention mechanism considers only edges of the same type, so it cannot be directly extended to knowledge graphs, which have multiple relation types between entities.
• This disadvantage is overcome by RelAtt, which employs an attention mechanism that helps in learning better representations. RelAtt performs better even when less contextual information is present.
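To make the equal-weight issue concrete, the sketch below shows an R-GCN-style layer that simply averages neighbor features per relation type, so every neighbor contributes with the same 1/|N_r| weight regardless of its relevance. This is an illustrative toy layer under assumed names (EqualWeightRelLayer, edges_by_rel), not the implementation of any cited model.

```python
# Sketch of the equal-weight aggregation criticised above: neighbor features
# are averaged per relation type, so all neighbors count the same.
import torch
import torch.nn as nn

class EqualWeightRelLayer(nn.Module):
    def __init__(self, num_rels, dim):
        super().__init__()
        # one linear transformation per relation type, plus a self-loop weight
        self.rel_w = nn.ModuleList(nn.Linear(dim, dim, bias=False) for _ in range(num_rels))
        self.self_w = nn.Linear(dim, dim, bias=False)

    def forward(self, x, edges_by_rel):
        # x: [N, d]; edges_by_rel[r] is a list of (src, dst) pairs with relation r
        out = self.self_w(x)
        agg = torch.zeros_like(out)
        for r, edges in enumerate(edges_by_rel):
            for dst in {d for _, d in edges}:
                srcs = [s for s, d in edges if d == dst]
                # every neighbor under relation r gets the same 1/|N_r| weight
                agg[dst] += self.rel_w[r](x[srcs].mean(dim=0))
        return torch.relu(out + agg)

x = torch.randn(4, 8)                     # 4 entities, 8-dim features
edges = [[(0, 1), (2, 1)], [(3, 0)]]      # two relation types (toy graph)
print(EqualWeightRelLayer(num_rels=2, dim=8)(x, edges).shape)  # torch.Size([4, 8])
```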
PROPOSED SYSTEM:
• In particular, many KGE models fusing external information have been proposed in recent years, and this external information is highly diverse.
• However, these models have not been well classified and summarized from the perspective of the integrated information. We first performed a comprehensive investigation of the KGE models proposed in recent years.
• Semantic-information-based models usually use similarity-based functions to define the scoring functions of traditional semantic-matching models, or introduce additional information to mine more knowledge in recently proposed models (an illustrative scoring-function sketch follows this list).
• The latter, recently proposed models fuse various kinds of additional information to mine deeper semantic information in the graph and thereby obtain better performance.
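As an illustration of the scoring functions mentioned above, the sketch below gives a translational-distance score in the style of TransE and a semantic-matching (similarity-based) score in the style of DistMult. Both are standard formulations, written here as simple assumed helper functions rather than code taken from the surveyed models.

```python
# Illustrative KGE scoring functions (standard formulations, assumed helpers):
# plausible triples (h, r, t) should receive high scores under both.
import torch

def transe_score(h, r, t, p=1):
    # translational distance: plausible triples give a small ||h + r - t||
    return -torch.norm(h + r - t, p=p, dim=-1)

def distmult_score(h, r, t):
    # semantic matching: plausible triples give a large trilinear product
    return (h * r * t).sum(dim=-1)

h, r, t = (torch.randn(2, 16) for _ in range(3))  # a batch of 2 toy triples
print(transe_score(h, r, t), distmult_score(h, r, t))
```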
ADVANTAGE:
• In the case of RGCN, as the contextual-information ratio increases from 0.2 to 0.5, the performance of RGCN improves because of the increased amount of context information.
• At a lower contextual-information ratio (0.2), RGCN performs worse than BERT, possibly because RGCN does not differentiate between neighbors.
• We evaluate the proposed approach on three datasets: two widely used public datasets (FB15k-237 and WN18) and one proprietary dataset (Comp). FB15k-237 is obtained from FB15k, a subset of the relational database Freebase containing general facts, by removing inverse relations.
• Higher values of MRR and Hits@K indicate better performance of the model (a sketch of both metrics follows this list).
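For reference, MRR and Hits@K are computed from the rank of the correct entity in each test query (ranks are 1-based), and higher is better for both. The short sketch below shows the standard definitions; the sample ranks are made up for illustration.

```python
# MRR: mean of reciprocal ranks; Hits@K: fraction of queries ranked within top K.
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    return sum(r <= k for r in ranks) / len(ranks)

ranks = [1, 3, 12, 2, 50]  # rank of the true entity for each toy query
print(f"MRR={mrr(ranks):.3f}, Hits@1={hits_at_k(ranks, 1):.2f}, "
      f"Hits@10={hits_at_k(ranks, 10):.2f}")
```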
