Geometric graphs are a special kind of graph with geometric features, and they are vital for modeling many scientific problems. Unlike generic graphs, geometric graphs often exhibit the physical symmetries of translations, rotations, and reflections, which makes them difficult to process effectively with current Graph Neural Networks (GNNs). To tackle this issue, researchers have proposed a variety of Geometric Graph Neural Networks equipped with invariant/equivariant properties to better characterize the geometry and topology of geometric graphs. Given the current progress in this field, it is imperative to conduct a comprehensive survey of the data structures, models, and applications related to geometric GNNs. In this paper, building on the necessary but concise mathematical preliminaries, we provide a unified view of existing models from the geometric message passing perspective. Additionally, we summarize the applications as well as the related datasets to facilitate future research on methodology development and experimental evaluation. We also discuss the challenges and potential future directions of Geometric GNNs at the end of this survey.
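To make the geometric message passing view concrete, the sketch below implements a single E(n)-equivariant layer in the style of EGNN (ICML'21, listed below): node features are updated only from invariant quantities (here, squared pairwise distances), while coordinates are updated through weighted sums of relative position vectors, which keeps the layer equivariant to rotations, reflections, and translations. This is a minimal illustration assuming a small fully connected graph; the class name `EGNNLayerSketch`, the hidden sizes, and the dense all-pairs formulation are illustrative choices, not the authors' reference implementation.

```python
# Minimal sketch of one E(n)-equivariant message-passing layer in the style of
# EGNN [ICML'21]. The class name, hidden sizes, and dense all-pairs graph are
# illustrative assumptions, not the reference implementation.
import torch
import torch.nn as nn


class EGNNLayerSketch(nn.Module):
    """h: invariant node features [N, h_dim]; x: 3D coordinates [N, 3]."""

    def __init__(self, h_dim: int = 64, m_dim: int = 64):
        super().__init__()
        # phi_e: builds edge messages from invariant inputs only (features + squared distance)
        self.phi_e = nn.Sequential(nn.Linear(2 * h_dim + 1, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, m_dim), nn.SiLU())
        # phi_x: one scalar weight per edge, used to scale the relative vector x_i - x_j
        self.phi_x = nn.Sequential(nn.Linear(m_dim, m_dim), nn.SiLU(),
                                   nn.Linear(m_dim, 1))
        # phi_h: updates invariant node features from aggregated messages
        self.phi_h = nn.Sequential(nn.Linear(h_dim + m_dim, h_dim), nn.SiLU(),
                                   nn.Linear(h_dim, h_dim))

    def forward(self, h: torch.Tensor, x: torch.Tensor):
        n = h.size(0)
        rel = x.unsqueeze(1) - x.unsqueeze(0)        # [N, N, 3], x_i - x_j
        dist2 = (rel ** 2).sum(-1, keepdim=True)     # [N, N, 1], rotation-invariant
        h_i = h.unsqueeze(1).expand(n, n, -1)
        h_j = h.unsqueeze(0).expand(n, n, -1)
        m = self.phi_e(torch.cat([h_i, h_j, dist2], dim=-1))   # invariant messages
        mask = 1.0 - torch.eye(n).unsqueeze(-1)      # drop self-messages
        # Coordinate update: a weighted sum of relative vectors stays E(n)-equivariant
        x_new = x + (rel * self.phi_x(m) * mask).mean(dim=1)
        # Feature update: aggregation of invariant messages stays invariant
        h_new = self.phi_h(torch.cat([h, (m * mask).sum(dim=1)], dim=-1))
        return h_new, x_new


if __name__ == "__main__":
    h, x = torch.randn(5, 64), torch.randn(5, 3)
    R = torch.linalg.qr(torch.randn(3, 3)).Q         # random orthogonal matrix
    layer = EGNNLayerSketch()
    h1, x1 = layer(h, x)
    h2, x2 = layer(h, x @ R.T)                       # rotate/reflect the input
    print(torch.allclose(h1, h2, atol=1e-4),         # features are invariant
          torch.allclose(x1 @ R.T, x2, atol=1e-4))   # coordinates are equivariant
```

The `__main__` check numerically verifies the symmetry property that the surveyed models enforce by construction: invariant features are unchanged under a random orthogonal transform, and the updated coordinates transform along with the input.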
- A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications
- [NIPS'17] SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
- [ICLR'20] DimeNet: Directional Message Passing for Molecular Graphs
- [arXiv:2011.14115] DimeNet++: Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules
- [ICPP'23] FastDimeNet++: Training DimeNet++ in 22 minutes
- [ICML'20] LieConv: Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data
- [NeurIPS'21] GemNet: Universal Directional Graph Neural Networks for Molecules
- [ICLR'22] SphereNet: Spherical Message Passing for 3D Molecular Graphs
- [NeurIPS'22] ComENet: Towards Complete and Efficient Message Passing for 3D Molecular Graphs
- [AAAI'24] QMP: A Plug-and-Play Quaternion Message-Passing Module for Molecular Conformation Representation
- [arXiv:1910.00753] Radial Field: Equivariant Flows: sampling configurations for multi-body systems with symmetric energies
- [ICLR'21] GVP-GNN: Learning from Protein Structure with Geometric Vector Perceptrons
- [ICML'21] EGNN: E(n) Equivariant Graph Neural Networks
- [ICML'21] PaiNN: Equivariant message passing for the prediction of tensorial properties and molecular spectra
- [NeurIPS'21] LoCS: Roto-translated Local Coordinate Frames For Interacting Dynamical Systems
- [NeurIPS'23] G-LoCS: Latent Field Discovery In Interacting Dynamical Systems With Neural Fields
- [ICLR'22] GMN: Equivariant Graph Mechanics Networks with Constraints
- [ICLR'22] Frame-Averaging: Frame Averaging for Invariant and Equivariant Network Design
- [ICML'22] ClofNet: SE(3) Equivariant Graph Neural Networks with Complete Local Frames
- [NeurIPS'22] EGHN: Equivariant Graph Hierarchy-Based Neural Networks
- [NeurIPS'23] LEFTNet: A new perspective on building efficient and expressive 3D equivariant graph neural networks
- [ICML'24] FastEGNN: Improving Equivariant Graph Neural Networks on Large Geometric Graphs via Virtual Nodes Learning
- [NC'2401] ViSNet: Enhancing geometric representations for molecules with equivariant vector-scalar interactive message passing
- [NeurIPS'24] Neural P$^3$M: A Long-Range Interaction Modeling Enhancer for Geometric GNNs
- [NeurIPS'24] HEGNN: Are High-Degree Representations Really Unnecessary in Equivariant Graph Neural Networks?
- [arXiv:1802.08219] Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds
- [NeurIPS'19] Cormorant: Covariant Molecular Neural Networks
- [ICLR'22] SEGNN: Geometric and Physical Quantities Improve E(3) Equivariant Message Passing
- [NC'2205] NequIP: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials
- [NeurIPS'22] SCN: Spherical Channels for Modeling Atomic Interactions
- [NeurIPS'22] MACE: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields
- [NeurIPS'20] SE(3)-Transformers: 3D Roto-Translation Equivariant Attention Networks
- [NeurIPS'21] Graphormer: Do Transformers Really Perform Bad for Graph Representation?
- [ICML'21] LieTransformer: Equivariant self-attention for Lie Groups
- [ICLR'22] TorchMD-Net: Equivariant Transformers for Neural Network based Molecular Potentials
- [ICML'22] GVP-Transformer: Learning inverse folding from millions of predicted structures
- [NeurIPS'22] Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
- [NeurIPS'23] EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations
- [NeurIPS'22] So3krates: Equivariant attention for interactions on arbitrary length-scales in molecular systems
- [NC'2408] So3krates: A Euclidean transformer for fast and stable machine learned force fields
- [NeurIPS'23] Geoformer: Geometric Transformer with Interatomic Positional Encoding
- [arXiv:2402.12714v1] EPT: Equivariant Pretrained Transformer for Unified Geometric Learning on Multi-Domain 3D Molecules
Datasets:
- $N$-Body——NRI: Neural relational inference for interacting systems
- 3D $N$-Body——EGNN: E(n) equivariant graph neural networks
- Constrained $N$-Body——GMN: Equivariant graph mechanics networks with constraints
- Hierarchical $N$-Body——EGHN: Equivariant graph hierarchy-based neural networks
Methods:
- NRI: Neural Relational Inference for Interacting Systems
- IN: Interaction networks for learning about objects, relations and physics
- E-NFs: E(n) Equivariant Normalizing Flows
- EGNN: E(n) equivariant graph neural networks
- SEGNNs: Geometric and Physical Quantities improve E(3) Equivariant Message Passing
- GMN: Equivariant Graph Mechanics Networks with Constraints
- EGHN: Equivariant Graph Hierarchy-Based Neural Networks
- HOGN: Hamiltonian Graph Networks with ODE Integrators
- NCGNN: Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic Systems
Datasets:
- Physion——SGNN: Learning Physical Dynamics with Subequivariant Graph Neural Networks
- Kubric MOVi-A——GNS*: Graph network simulators can learn discontinuous, rigid contact dynamics
- FluidFall & FluidShake & BoxBath & RiceGrip——DPI-Net: Learning particle dynamics for manipulating rigid bodies, deformable objects, and fluids
- Water3D——GNS: Learning to simulate complex physics with graph networks
- MIT Pushing——FIGNet: Learning rigid dynamics with face interaction graph networks
Methods:
- SGNN: Learning Physical Dynamics with Subequivariant Graph Neural Networks
- GNS: Learning to simulate complex physics with graph networks
- C-GNS: Constraint-based graph network simulator
- GNS*: Graph network simulators can learn discontinuous, rigid contact dynamics
- HGNS: Learning large-scale subsurface simulations with a hybrid graph network simulator
- DPI-Net: Learning particle dynamics for manipulating rigid bodies, deformable objects, and fluids
- HRN: Flexible Neural Representation for Physics Prediction
- FIGNet: Learning rigid dynamics with face interaction graph networks
- EGHN: Equivariant Graph Hierarchy-Based Neural Networks
- LoCS: Roto-translated Local Coordinate Frames For Interacting Dynamical Systems
- EqMotion: Equivariant Multi-agent Motion Prediction with Invariant Interaction Reasoning
- ESTAG: Equivariant Spatio-Temporal Attentive Graph Networks to Simulate Physical Dynamics
- SEGNO: Improving Generalization in Equivariant Graph Neural Networks with Physical Inductive Biases
Datasets:
Methods:
- Cormorant: Covariant Molecular Neural Networks
- TFN: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds
- SE(3)-Transformers: 3D Roto-Translation Equivariant Attention Networks
- NequIP: E(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
- SEGNNs: Geometric and Physical Quantities improve E(3) Equivariant Message Passing
- LieConv: Generalizing convolutional neural networks for equivariance to Lie groups on arbitrary continuous data
- LieTransformer: Equivariant self-attention for Lie groups
- SchNet: A deep learning architecture for molecules and materials
- DimeNet: Directional Message Passing for Molecular Graphs
- GemNet: Universal Directional Graph Neural Networks for Molecules
- PaiNN: Equivariant message passing for the prediction of tensorial properties and molecular spectra
- TorchMD-NET: Equivariant Transformers for Neural Network based Molecular Potentials
- Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
- SphereNet: Spherical Message Passing for 3D Molecular Graphs
- EGNN: E(n) equivariant graph neural networks
- Graphormer: Do Transformers Really Perform Bad for Graph Representation?
- SCN: Spherical channels for modeling atomic interactions
- eSCN: Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs
Datasets:
- MD17——GMN: Equivariant Graph Mechanics Networks with Constraints
- OCP——GemNet: Universal Directional Graph Neural Networks for Molecules
- AdK——EGHN: Equivariant Graph Hierarchy-Based Neural Networks
- DW-4 & LJ-13——E-CNF: Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities
- Fast-folding proteins——ITO: Implicit Transfer Operator Learning: Multiple Time-Resolution Models for Molecular Dynamics
Methods:
- EGNN: E(n) equivariant graph neural networks
- NequIP: E(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
- GMN: Equivariant Graph Mechanics Networks with Constraints
- EGHN: Equivariant Graph Hierarchy-Based Neural Networks
- NCGNN: Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic Systems
- ESTAG: Equivariant Spatio-Temporal Attentive Graph Networks to Simulate Physical Dynamics
- SEGNO: Improving Generalization in Equivariant Graph Neural Networks with Physical Inductive Biases
- ITO: Implicit Transfer Operator Learning: Multiple Time-Resolution Models for Molecular Dynamics
- E-CNF: Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities
- E-ACF: SE(3) Equivariant Augmented Coupling Flows
Datasets:
Methods:
- GeoDiff: A Geometric Diffusion Model for Molecular Conformation Generation
- GeoLDM: Geometric Latent Diffusion Models for 3D Molecule Generation
- ConfVAE: An End-to-End Framework for Molecular Conformation Generation via Bilevel Programming
- ConfGF: Learning Gradient Fields for Molecular Conformation Generation
- G-SchNet: Symmetry-adapted generation of 3d point sets for the targeted discovery of molecules
- cG-SchNet: Inverse design of 3d molecular structures with conditional generative neural networks
- DGSM: Predicting Molecular Conformation via Dynamic Graph Score Matching
- E-NFs: E(n) Equivariant Normalizing Flows
- EDM: Equivariant diffusion for molecule generation in 3d
- GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles
- Torsional Diffusion: Torsional Diffusion for Molecular Conformer Generation
- EEGSDE: Equivariant Energy-Guided SDE for Inverse Molecular Design
- DMCG: Direct molecular conformation generation
- MDM: Molecular Diffusion Model for 3D Molecule Generation
- MolDiff: Addressing the Atom-Bond Inconsistency Problem in 3D Molecule Diffusion Generation
- EquiFM: Equivariant Flow Matching with Hybrid Probability Transport for 3D Molecule Generation
- Hierdiff: Coarse-to-Fine: a Hierarchical Diffusion Model for Molecule Generation in 3D
- MPerformer: An SE(3) Transformer-based Molecular Perceptron
Datasets:
- QM9——3D-Infomax: 3D Infomax improves GNNs for molecular property prediction
- GEOM & QMugs——3D-Infomax: 3D Infomax improves GNNs for molecular property prediction
- PCQM4Mv2——3D-PGT: Automated 3D pre-training for molecular property prediction
- Uni-Mol——Uni-Mol: A Universal 3D Molecular Representation Learning Framework
Methods:
- 3D-EMGP: Energy-Motivated Equivariant Pretraining for 3D Molecular Graphs
- GeoSSL-DDM: Molecular Geometry Pretraining with SE(3)-Invariant Denoising Distance Matching
- GraphMVP: Pre-training Molecular Graph Representation with 3D Geometry
- GNS-TAT: Pre-training via Denoising for Molecular Property Prediction
- 3D-Infomax: 3D Infomax improves GNNs for molecular property prediction
- Uni-Mol: A Universal 3D Molecular Representation Learning Framework
- Transformer-M: One transformer can understand both 2D & 3D molecular data
- SliDe: Sliced Denoising: A Physics-Informed Molecular Pre-Training Method
- Frad: Fractional Denoising for 3D Molecular Pre-training
- MGMAE: Molecular Representation Learning by Reconstructing Heterogeneous Graphs with A High Mask Ratio
- MoleculeSDE: A Group Symmetric Stochastic Differential Equation Model for Molecule Multi-modal Pretraining
Datasets:
- Gene Ontology——GearNet: Protein representation learning by geometric structure pretraining
- ENZYME——GearNet: Protein representation learning by geometric structure pretraining
- SCOPe——TAPE: Evaluating protein transfer learning with TAPE
- UniProt——DeepLoc: prediction of protein subcellular localization using deep learning
- PDB——ATOM3D: Tasks On Molecules in Three Dimensions
Methods:
- DeepFRI: Structure-based protein function prediction using graph convolutional network
- LM-GVP: an extensible sequence and structure informed deep learning framework for protein property prediction
- GearNet: Protein representation learning by geometric structure pretraining
- 3DCNN: 3D deep convolutional neural networks for amino acid environment similarity analysis
- TM-align: a protein structure alignment algorithm based on the TM-score
- GVP: Learning from Protein Structure with Geometric Vector Perceptrons
- PAUL: Hierarchical rotation-equivariant neural networks to select structural models of protein complexes
- EDN: Protein model quality assessment using rotation-equivariant transformations on point clouds
- EnQA: 3D-equivariant graph neural networks for protein model quality assessment
- ScanNet: an interpretable geometric deep learning model for structure-based protein binding site prediction
- PocketMiner: Predicting locations of cryptic pockets from single protein structures using the PocketMiner graph neural network
- EquiPocket: an E(3)-Equivariant Geometric Graph Neural Network for Ligand Binding Site Prediction
Datasets:
- CATH——GVP: Learning from Protein Structure with Geometric Vector Perceptrons
- SCOPe——ProstT5: Bilingual Language Model for Protein Sequence and Structure
- AlphaFoldDB & PDB & ESM Atlas——ESMFold: Evolutionary-scale prediction of atomic-level protein structure with a language model
- CASP——ATOM3D: Tasks On Molecules in Three Dimensions
Methods:
- GVP: Learning from Protein Structure with Geometric Vector Perceptrons
- Generative models for graph-based protein design
- ESM-IF1: Learning inverse folding from millions of predicted structures
- GCA: Generative de novo protein design with global context
- ProteinMPNN: Robust deep learning based protein sequence design using ProteinMPNN
- PiFold: Toward effective and efficient protein inverse folding
- LM-Design: Structure-informed Language Models Are Protein Designers
Methods:
- AlphaFold: Improved protein structure prediction using potentials from deep learning
- AlphaFold2: Highly accurate protein structure prediction with AlphaFold
- RoseTTAFold: Accurate prediction of protein structures and interactions using a three-track neural network
- RoseTTAFold2: Efficient and accurate prediction of protein structure using RoseTTAFold2
- RFAA: Generalized Biomolecular Modeling and Design with RoseTTAFold All-Atom
- RFdiffusion: De novo design of protein structure and function with RFdiffusion
- EigenFold: Generative Protein Structure Prediction with Diffusion Models
- Chroma: Illuminating protein space with a programmable generative model
- ESMFold: Evolutionary-scale prediction of atomic-level protein structure with a language model
- HelixFold-Single: MSA-free Protein Structure Prediction by Using Protein Language Model as an Alternative
Datasets:
- CATH——S2F: Multimodal pre-training model for sequence-based prediction of protein-protein interaction
- SCOPe——ProSE: Learning the protein language: Evolution, structure, and function
- AlphaFoldDB——GearNet: Protein representation learning by geometric structure pretraining
- UniProt & BFD——ProtTrans: Toward understanding the language of life through self-supervised learning
- NetSurfP-2.0——PEER: A comprehensive and multi-task benchmark for protein sequence understanding
Methods:
- ProtTrans: Toward understanding the language of life through self-supervised learning
- ProtGPT2: ProtGPT2 is a deep unsupervised language model for protein design
- PromptProtein: Multi-level Protein Structure Pre-training via Prompt Learning
- GearNet: Protein representation learning by geometric structure pretraining
- xTrimoPGLM: Unified 100B-Scale Pre-trained Transformer for Deciphering the Language of Protein
- ProFSA: Self-supervised Pocket Pretraining via Protein Fragment-Surroundings Alignment
- DrugCLIP: Contrastive Protein-Molecule Representation Learning for Virtual Screening
- HJRSS: Toward More General Embeddings for Protein Design: Harnessing Joint Representations of Sequence and Structure
- ESM-1b: Biological structure and function emerge from scaling unsupervised learning to 250 million protein sequences
- ESM2: Evolutionary-scale prediction of atomic-level protein structure with a language model
Datasets:
Methods:
- TargetDiff: 3D Equivariant Diffusion for Target-Aware Molecule Generation and Affinity Prediction
- GET: Generalist Equivariant Transformer Towards 3D Molecular Interaction Learning
- MaSIF: Deciphering interaction fingerprints from protein molecular surfaces using geometric deep learning
- ProtNet: Learning protein representations via complete 3d graph networks
- HGIN: Geometric Graph Learning for Protein Mutation Effect Prediction
- BindNet: Protein-ligand binding representation learning from fine-grained interactions
Datasets:
Methods:
- EquiBind: Geometric Deep Learning for Drug Binding Structure Prediction
- DiffDock: Diffusion Steps, Twists, and Turns for Molecular Docking
- TANKBind: Trigonometry-Aware Neural NetworKs for Drug-Protein Binding Structure Prediction
- DESERT: Zero-shot 3d drug design by sketching and generating
- FABind: Fast and Accurate Protein-Ligand Binding
Datasets:
Methods:
- Pocket2Mol: Efficient Molecular Sampling Based on 3D Protein Pockets
- TargetDiff: 3D Equivariant Diffusion for Target-Aware Molecule Generation and Affinity Prediction
- SBDD: A 3D Generative Model for Structure-Based Drug Design
- FLAG: Molecule Generation For Target Protein Binding with Structural Motifs
- DESERT: Zero-Shot 3D Drug Design by Sketching and Generating
Datasets:
- DB5.5——GET: Generalist Equivariant Transformer Towards 3D Molecular Interaction Learning
- PDBBind——GeoPPI: Deep geometric representations for modeling effects of mutations on protein-protein binding affinity
- SKEMPI 2.0——mmCSM-PPI: predicting the effects of multiple point mutations on protein–protein interactions
Methods:
Datasets:
Methods:
- EquiDock: Independent SE(3)-Equivariant Models for End-to-End Rigid Protein Docking
- HMR: Learning Harmonic Molecular Representations on Riemannian Manifold
- HSRN: Antibody-antigen docking and design via hierarchical structure refinement
- DiffDock-PP: Rigid Protein-Protein Docking with Diffusion Models
- dMaSIF-extension: Physics-informed deep neural network for rigid-body protein docking
- AlphaFold-Multimer: Protein complex prediction with AlphaFold-Multimer
- SyNDock: N Rigid Protein Docking via Learnable Group Synchronization
- ElliDock: Rigid Protein-Protein Docking via Equivariant Elliptic-Paraboloid Interface Prediction
- EBMDock: Neural Probabilistic Protein-Protein Docking via a Differentiable Energy Model
Datasets:
- SAbDab & RAbD & CoV-AbDab——RefineGNN: Iterative Refinement Graph Neural Network for Antibody Sequence-Structure Co-design
- SKEMPI 2.0——ATOM3D: Tasks On Molecules in Three Dimensions
Methods:
- DiffAb: Antigen-Specific Antibody Design and Optimization with Diffusion-Based Generative Models for Protein Structures
- MEAN: Conditional Antibody Design as 3D Equivariant Graph Translation
- dyMEAN: End-to-End Full-Atom Antibody Design
- RefineGNN: Iterative Refinement Graph Neural Network for Antibody Sequence-Structure Co-design
- PROTSEED: Protein Sequence and Structure Co-Design with Equivariant Translation
- ADesigner: Cross-Gate MLP with Protein Complex Invariant Embedding is A One-Shot Antibody Designer
- AbBERT: Incorporating Pre-training Paradigm for Antibody Sequence-Structure Co-design
- AbODE: Ab initio antibody design using conjoined ODEs
- AbDiffuser: Full-Atom Generation of In-Vitro Functioning Antibodies
- tFold: Fast and accurate modeling and design of antibody-antigen complex using tFold
Datasets:
Methods:
- CGCNN: Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties
- MEGNet: Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals
- ALIGNN: Atomistic Line Graph Neural Network for Improved Materials Property Predictions
- ECN: Equivariant Networks for Crystal Structures
- Matformer: Periodic Graph Transformers for Crystal Material Property Prediction
- Crystal twins: Self-supervised Learning for Crystalline Material Property Prediction
- MMPT: A Crystal-Specific Pre-Training Framework for Crystal Material Property Prediction
- CrysDiff: A Diffusion-Based Pre-training Framework for Crystal Property Prediction
Datasets:
- Perov-5 & Carbon-24 & MP-20——CDVAE: Crystal Diffusion Variational Autoencoder for Periodic Material Generation
Methods:
- CDVAE: Crystal Diffusion Variational Autoencoder for Periodic Material Generation
- DiffCSP: Crystal Structure Prediction by Joint Equivariant Diffusion
- DiffCSP++: Space Group Constrained Crystal Generation
- SyMat: Towards Symmetry-Aware Generation of Periodic Materials
- MatterGen: A Generative Model for Inorganic Materials Design
Datasets:
- FARFAR2-Puzzles——ARES: Geometric deep learning of RNA structure
Methods:
- Geometrically equivariant graph neural networks: A survey
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems
- A Systematic Survey of Chemical Pre-trained Models
- Graph-based Molecular Representation Learning
- Geometric Deep Learning on Molecular Representations
- Artificial intelligence for science in quantum, atomistic, and continuum systems