
# Heterogeneous Graph Transformer (HGT)

An alternative PyTorch-Geometric implementation.

Heterogeneous Graph Transformer is a graph neural network architecture that can deal with large-scale heterogeneous and dynamic graphs.
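For orientation, below is a minimal sketch of what a 2-layer HGT node classifier can look like when built on PyTorch Geometric's bundled `HGTConv` layer. This is an illustration only, not this repository's own implementation; `metadata`, `hidden_dim`, `out_dim`, `num_heads`, and the `'paper'` target node type are assumed placeholders.

```python
import torch
from torch_geometric.nn import HGTConv, Linear


class HGT(torch.nn.Module):
    """Illustrative 2-layer HGT built on PyG's HGTConv (not this repo's code).

    `metadata` is the (node_types, edge_types) tuple returned by
    HeteroData.metadata(); dimensions and head count are placeholders.
    """

    def __init__(self, metadata, hidden_dim=256, out_dim=4, num_heads=4):
        super().__init__()
        # -1 lets PyG infer the per-type input dimension lazily.
        self.conv1 = HGTConv(-1, hidden_dim, metadata, heads=num_heads)
        self.conv2 = HGTConv(hidden_dim, hidden_dim, metadata, heads=num_heads)
        self.out = Linear(hidden_dim, out_dim)

    def forward(self, x_dict, edge_index_dict):
        x_dict = self.conv1(x_dict, edge_index_dict)
        x_dict = {k: v.relu() for k, v in x_dict.items()}
        x_dict = self.conv2(x_dict, edge_index_dict)
        # Classify the assumed target node type ('paper' in the ACM task).
        return self.out(x_dict['paper'])
```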

This toy experiment is based on DGL's official tutorial. Since the ACM dataset has no input features, we simply assign random features to each node; this step can be replaced with any prepared features. A sketch of this assignment is shown below.
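The sketch below shows one way to do the random-feature assignment on a PyG `HeteroData` graph. The node-type names, node counts, and `hidden_dim` are illustrative assumptions, not values taken from the actual script.

```python
import torch
from torch_geometric.data import HeteroData

hidden_dim = 256  # assumed input feature dimension

# Illustrative heterogeneous graph; node types and counts are placeholders.
data = HeteroData()
data['paper'].num_nodes = 3000
data['author'].num_nodes = 6000
data['subject'].num_nodes = 60

for node_type in data.node_types:
    # Random features stand in for real inputs; swap in prepared
    # features here if they are available.
    data[node_type].x = torch.randn(data[node_type].num_nodes, hidden_dim)
```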

Reference performance against R-GCN and an MLP baseline, averaged over 5 runs:

| Model        | Test Accuracy | # Parameters |
|--------------|---------------|--------------|
| 2-layer HGT  | 0.465 ± 0.007 | 2,176,324    |
| 2-layer RGCN | 0.392 ± 0.013 | 416,340      |
| MLP          | 0.132 ± 0.003 | 200,974      |