Official code for "GraphFM: A Comprehensive Benchmark for Graph Foundation Model". GraphFM is a comprehensive benchmark for Graph Foundation Models (Graph FMs) built on graph self-supervised learning (GSSL), and it aims to study the homogenization and scalability of Graph FMs.
GraphFM provides a fair and comprehensive platform to evaluate existing GSSL works and to facilitate future research.
We benchmark state-of-the-art self-supervised GNN models along four key dimensions: dataset scale, training strategies, GSSL methods for Graph FMs, and adaptability to different downstream tasks.
The required packages can be installed by running `pip install -r requirements.txt`.
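For example, a minimal setup sketch; the virtual environment is our suggestion, not a repository requirement:

```bash
# Optional: isolate dependencies in a virtual environment (our suggestion).
python -m venv graphfm-env
source graphfm-env/bin/activate

# Install the packages listed by the repository.
pip install -r requirements.txt
```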
The supported options are:

- `model`: BGRL, CCA-SSG, GBT, GCA, GraphECL, GraphMAE, GraphMAE2, S2GAE
- `dataset`: cora, pubmed, citeseer, Flickr, Reddit, ogbn-arxiv
- `batch_type`: full_batch, node_sampling, subgraph_sampling
You can run `python main_optuna.py --type_model $model --dataset $dataset --batch_type $batch_type` to search for the best hyperparameters; a concrete example follows.
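For instance, to tune GraphMAE on cora with full-batch training (the flags are those shown above; the particular model and dataset are just an illustration):

```bash
python main_optuna.py --type_model GraphMAE --dataset cora --batch_type full_batch
```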
After obtaining the hyperparameters tuned by Optuna, you can train the model with `main.py`.
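A sketch of the training call, assuming `main.py` accepts the same `--type_model`, `--dataset`, and `--batch_type` flags as `main_optuna.py` (an assumption on our part; check the script's argument parser):

```bash
# Train with the Optuna-tuned hyperparameters (flags assumed to mirror main_optuna.py).
python main.py --type_model GraphMAE --dataset cora --batch_type full_batch
```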
**2024.6.15** Submitted our paper to arXiv.
| ID | Paper | Method | Conference |
|----|-------|--------|------------|
| 1 | Large-Scale Representation Learning on Graphs via Bootstrapping | BGRL | ICLR 2022 |
| 2 | From Canonical Correlation Analysis to Self-supervised Graph Neural Networks | CCA-SSG | NeurIPS 2021 |
| 3 | Graph Barlow Twins: A self-supervised representation learning framework for graphs | GBT | Knowledge-Based Systems 2022 |
| 4 | Graph Contrastive Learning with Adaptive Augmentation | GCA | WWW 2021 |
| 5 | GraphECL: Towards Efficient Contrastive Learning for Graphs | GraphECL | Under Review |
| 6 | GraphMAE: Self-Supervised Masked Graph Autoencoders | GraphMAE | KDD 2022 |
| 7 | GraphMAE2: A Decoding-Enhanced Masked Self-Supervised Graph Learner | GraphMAE2 | WWW 2023 |
| 8 | S2GAE: Self-Supervised Graph Autoencoders are Generalizable Learners with Graph Masking | S2GAE | WSDM 2023 |
If you find this repo useful, please star it and cite our paper:

    @article{xu2024graphfm,
      title={GraphFM: A Comprehensive Benchmark for Graph Foundation Model},
      author={Xu, Yuhao and Liu, Xinqi and Duan, Keyu and Fang, Yi and Chuang, Yu-Neng and Zha, Daochen and Tan, Qiaoyu},
      journal={arXiv preprint arXiv:2406.08310},
      year={2024}
    }