Supervision: Eda Bayram

Project type: Semester project (master)



Node attribute completion in knowledge graphs.


In recent years, only a few graph representation learning (GRL) approaches have been designed for heterogeneous graphs [1]. A prominent example of a heterogeneous graph is a knowledge graph (KG), which comprises multiple types of nodes and edges, each potentially carrying attributes. Many AI applications exploit KGs, and a major task there is completing missing facts. In this project, we particularly address the task of node attribute completion in KGs: predicting missing node attributes by reasoning over the observed attributes and the given heterogeneous structure of the KG.
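To make the task concrete, here is a hypothetical toy KG (the node names, types, and the "population" attribute are illustrative assumptions, not part of the project data): facts are typed triples, and one node is missing the attribute we want to complete.

```python
# Hypothetical toy knowledge graph: typed nodes with a numeric attribute,
# some of which are missing (None) and must be predicted.
nodes = {
    "paris":  {"type": "city",    "population": 2.1e6},
    "lyon":   {"type": "city",    "population": 5.2e5},
    "france": {"type": "country", "population": None},  # attribute to complete
}

# Facts are (head, relation, tail) triples; relations are typed too.
edges = [
    ("paris", "capital_of", "france"),
    ("lyon",  "located_in", "france"),
]

# Node attribute completion = predicting france's missing attribute by
# reasoning over the observed attributes and the typed structure.
observed = [n for n, d in nodes.items() if d["population"] is not None]
missing = [n for n, d in nodes.items() if d["population"] is None]
print(observed, missing)  # → ['paris', 'lyon'] ['france']
```

Note that the two city nodes reach the country node through different relation types, which is exactly why a homogeneous node-regression scheme is insufficient here.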


Node attribute completion in KGs is not as straightforward as node regression in simple graphs: the message-passing strategy must be adapted to regress multiple types of node attributes through the multiple types of relations in a KG. We therefore work with a heterogeneous message-passing scheme [2], in which a message-passing path is described by a triplet <source node-attribute type, relation type connecting the source node to the target node, target node-attribute type>. Accordingly, the computation graph of message passing differs from the input graph, being an augmented version of it. While the message transformation function is learned specifically for each message-passing path, a heterogeneous attention mechanism [3] is essential for deciding the importance of the incoming messages during aggregation.
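The scheme above can be sketched as follows. This is a minimal NumPy illustration, not the project's implementation: the paths, dimensions, and random "weights" are stand-ins for quantities that would be learned; the attention scoring follows the general pattern of heterogeneous attention, simplified to a dot product per path.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # attribute embedding dimension (illustrative)

# A message-passing path is a triplet:
# (source node-attribute type, relation type, target node-attribute type).
paths = [
    ("city_population", "capital_of", "country_population"),
    ("city_population", "located_in", "country_population"),
]

# One transformation matrix per path (random stand-ins for learned weights).
W = {p: rng.normal(size=(d, d)) for p in paths}
# One attention vector per path (heterogeneous attention, simplified).
a = {p: rng.normal(size=d) for p in paths}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate(incoming):
    """incoming: list of (path, source_feature) pairs for one target node.
    Each message is transformed by its path-specific weight, scored by a
    path-specific attention vector, then combined as a weighted sum."""
    msgs = np.stack([W[p] @ h for p, h in incoming])             # transform per path
    scores = np.array([a[p] @ (W[p] @ h) for p, h in incoming])  # attention logits
    alpha = softmax(scores)                                      # importance weights
    return alpha @ msgs                                          # weighted aggregation

h_paris = rng.normal(size=d)  # observed source-attribute embeddings
h_lyon = rng.normal(size=d)
h_france = aggregate([(paths[0], h_paris), (paths[1], h_lyon)])
print(h_france.shape)  # (4,)
```

Because the transformation is indexed by the full path rather than by the relation alone, messages carrying different attribute types over the same relation are treated differently, which is what distinguishes this from plain relational message passing.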


Prerequisites: fluency in Python, familiarity with machine learning in PyTorch, and enthusiasm for GRL.


  1. Michael Schlichtkrull, Thomas N Kipf, Peter Bloem, Rianne Van Den Berg, Ivan Titov, and Max Welling, “Modeling relational data with graph convolutional networks,” in European Semantic Web Conference. Springer, 2018, pp. 593–607.
  2. Eda Bayram, Alberto Garcia-Duran, and Robert West, “Node attribute completion in knowledge graphs with multi-relational propagation,” arXiv preprint arXiv:2011.05301, 2020.
  3. Ziniu Hu, Yuxiao Dong, Kuansan Wang, and Yizhou Sun, “Heterogeneous graph transformer,” in Proceedings of The Web Conference 2020, 2020, pp. 2704–2710.