
GraphFM

Official code for "GraphFM: A Comprehensive Benchmark for Graph Foundation Model". GraphFM is a comprehensive benchmark for Graph Foundation Models (Graph FMs) built on graph self-supervised learning (GSSL). It aims to study the homogenization and scalability of Graph FMs.

Overview of the Benchmark

GraphFM provides a fair and comprehensive platform to evaluate existing GSSL works and facilitate future research.

(Figure: overview of the GraphFM benchmark architecture.)

We perform a comprehensive benchmark of state-of-the-art self-supervised GNN models through four key aspects: dataset scale, training strategies, GSSL methods for Graph FMs, and adaptability to different downstream tasks.

Installation

The required packages can be installed by running `pip install -r requirements.txt`.

🚀Quick Start

Set up the Model, Dataset, and Batch type parameters

- `model`: BGRL, CCA-SSG, GBT, GCA, GraphECL, GraphMAE, GraphMAE2, S2GAE
- `dataset`: cora, pubmed, citeseer, Flickr, Reddit, ogbn-arxiv
- `batch_type`: full_batch, node_sampling, subgraph_sampling
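The parameters above can be combined freely. As an illustration, a small sweep script over every model on the citation datasets might look like the sketch below; the flag names (`--type_model`, `--dataset`, `--batch_type`) mirror the `main_optuna.py` command in this README, so verify them against the scripts before running.

```shell
# Sweep sketch: one tuning command per (model, dataset) pair.
# Commands are echoed as a dry run -- pipe the output to `sh` to execute.
for model in BGRL CCA-SSG GBT GCA GraphECL GraphMAE GraphMAE2 S2GAE; do
  for dataset in cora pubmed citeseer; do
    echo "python main_optuna.py --type_model $model --dataset $dataset --batch_type full_batch"
  done
done
```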

Get Best Hyperparameters

Run `python main_optuna.py --type_model $model --dataset $dataset --batch_type $batch_type` to search for the best hyperparameters with Optuna.

Train the Model

After obtaining the Optuna-tuned hyperparameters, train the model with `main.py`.
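Put together, an end-to-end run for one configuration might look like the following sketch. It assumes `main.py` accepts the same flags as the `main_optuna.py` command in this README (check the script's argument parser to confirm); the commands are echoed as a dry run.

```shell
# Hypothetical two-step workflow: tune, then train with the same flags.
# Drop the leading `echo` on each line to actually execute.
MODEL=GraphMAE
DATASET=ogbn-arxiv
BATCH_TYPE=node_sampling

echo python main_optuna.py --type_model "$MODEL" --dataset "$DATASET" --batch_type "$BATCH_TYPE"
echo python main.py --type_model "$MODEL" --dataset "$DATASET" --batch_type "$BATCH_TYPE"
```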

📱️Updates

2024.6.15 Submitted our paper to arXiv.

Reference

| ID | Paper | Method | Conference |
|----|-------|--------|------------|
| 1 | Large-Scale Representation Learning on Graphs via Bootstrapping | BGRL | ICLR 2022 |
| 2 | From Canonical Correlation Analysis to Self-supervised Graph Neural Networks | CCA-SSG | NeurIPS 2021 |
| 3 | Graph Barlow Twins: A self-supervised representation learning framework for graphs | GBT | Knowledge-Based Systems 2022 |
| 4 | Graph Contrastive Learning with Adaptive Augmentation | GCA | WWW 2021 |
| 5 | GraphECL: Towards Efficient Contrastive Learning for Graphs | GraphECL | Under Review |
| 6 | GraphMAE: Self-Supervised Masked Graph Autoencoders | GraphMAE | KDD 2022 |
| 7 | GraphMAE2: A Decoding-Enhanced Masked Self-Supervised Graph Learner | GraphMAE2 | WWW 2023 |
| 8 | S2GAE: Self-Supervised Graph Autoencoders are Generalizable Learners with Graph Masking | S2GAE | WSDM 2023 |

Citation

If you find this repo useful, please star the repo and cite:

@article{xu2024graphfm,
      title={GraphFM: A Comprehensive Benchmark for Graph Foundation Model},
      author={Xu, Yuhao and Liu, Xinqi and Duan, Keyu and Fang, Yi and Chuang, Yu-Neng and Zha, Daochen and Tan, Qiaoyu},
      journal={arXiv preprint arXiv:2406.08310},
      year={2024}
    }
