diff --git a/GNNGraphs/docs/src/api/temporalgraph.md b/GNNGraphs/docs/src/api/temporalgraph.md
index f446adf04..c9992e7c4 100644
--- a/GNNGraphs/docs/src/api/temporalgraph.md
+++ b/GNNGraphs/docs/src/api/temporalgraph.md
@@ -10,7 +10,7 @@ Pages = ["temporalsnapshotsgnngraph.jl"]
Private = false
```

-### TemporalSnapshotsGNNGraph random generators
+## TemporalSnapshotsGNNGraph random generators

```@docs
rand_temporal_radius_graph
diff --git a/GNNGraphs/docs/src/index.md b/GNNGraphs/docs/src/index.md
index fc64196cb..6bc153228 100644
--- a/GNNGraphs/docs/src/index.md
+++ b/GNNGraphs/docs/src/index.md
@@ -1,6 +1,6 @@
# GNNGraphs.jl

-GNNGraphs.jl is a package that provides graph data structures and helper functions specifically designed for working with graph neural networks. This package allows to store not only the graph structure, but also features associated with nodes, edges, and the graph itself. It is the core foundation for the GNNlib, GraphNeuralNetworks, and GNNLux packages.
+GNNGraphs.jl is a package that provides graph data structures and helper functions specifically designed for working with graph neural networks. This package allows you to store not only the graph structure, but also features associated with nodes, edges, and the graph itself. It is the core foundation for the GNNlib.jl, GraphNeuralNetworks.jl, and GNNLux.jl packages.

It supports three types of graphs:

@@ -12,4 +12,16 @@ It supports three types of graphs:
-This package depends on the package [Graphs.jl] (https://github.com/JuliaGraphs/Graphs.jl).
\ No newline at end of file
+This package depends on [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
+
+## Installation
+
+The package can be installed with the Julia package manager.
+From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
+
+```julia
+pkg> add GNNGraphs
+```
diff --git a/GNNLux/docs/Project.toml b/GNNLux/docs/Project.toml
index dbb31551d..97644b929 100644
--- a/GNNLux/docs/Project.toml
+++ b/GNNLux/docs/Project.toml
@@ -1,5 +1,6 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+DocumenterInterLinks = "d12716ef-a0f6-4df4-a9f1-a5a34e75c656"
GNNLux = "e8545f4d-a905-48ac-a8c4-ca114b98986d"
GNNlib = "a6a84749-d869-43f8-aacc-be26a1996e48"
LiveServer = "16fef848-5104-11e9-1b77-fb7a48bbb589"
diff --git a/GNNLux/docs/make.jl b/GNNLux/docs/make.jl
index 0914a3f9f..cb7e1e8b3 100644
--- a/GNNLux/docs/make.jl
+++ b/GNNLux/docs/make.jl
@@ -1,4 +1,5 @@
using Documenter
+using DocumenterInterLinks
using GNNlib
using GNNLux

@@ -8,11 +9,15 @@ assets=[]
prettyurls = get(ENV, "CI", nothing) == "true"
mathengine = MathJax3()
-
+interlinks = InterLinks(
+    "GNNGraphs" => ("https://carlolucibello.github.io/GraphNeuralNetworks.jl/GNNGraphs/", joinpath(dirname(dirname(@__DIR__)), "GNNGraphs", "docs", "build", "objects.inv")),
+    "GNNlib" => ("https://carlolucibello.github.io/GraphNeuralNetworks.jl/GNNlib/", joinpath(dirname(dirname(@__DIR__)), "GNNlib", "docs", "build", "objects.inv")))
+
makedocs(;
         modules = [GNNLux],
         doctest = false,
         clean = true,
+        plugins = [interlinks],
         format = Documenter.HTML(; mathengine, prettyurls, assets = assets, size_threshold=nothing),
         sitename = "GNNLux.jl",
         pages = ["Home" => "index.md",
diff --git a/GNNLux/docs/src/api/basic.md b/GNNLux/docs/src/api/basic.md
index 2242745d6..aef66c50a 100644
--- a/GNNLux/docs/src/api/basic.md
+++ b/GNNLux/docs/src/api/basic.md
@@ -4,5 +4,6 @@ CurrentModule = GNNLux
## GNNLayer

```@docs
-GNNLux.GNNLayer
+GNNLayer
+GNNChain
```
\ No newline at end of file
diff --git a/GNNLux/docs/src/api/conv.md b/GNNLux/docs/src/api/conv.md
new file mode 100644
index 000000000..89114c82c
--- /dev/null
+++ b/GNNLux/docs/src/api/conv.md
@@ -0,0 +1,27 @@
+```@meta
+CurrentModule = GNNLux
+```
+
+# Convolutional Layers
+
+Many different types of graph convolutional layers have been proposed in the literature. Choosing the right layer for your application could involve a lot of exploration.
+Multiple graph convolutional layers are typically stacked together to create a graph neural network model (see [`GNNChain`](@ref)).
+
+The table below lists all graph convolutional layers implemented in *GNNLux.jl*. It also highlights the presence of some additional capabilities with respect to basic message passing:
+- *Sparse Ops*: implements message passing as multiplication by a sparse adjacency matrix instead of the gather/scatter mechanism. This can lead to better CPU performance, but it is not yet supported on GPU.
+- *Edge Weight*: supports scalar weights (or equivalently scalar features) on edges.
+- *Edge Features*: supports feature vectors on edges.
+- *Heterograph*: supports heterogeneous graphs (see [`GNNHeteroGraph`](@ref)).
+- *TemporalSnapshotsGNNGraphs*: supports temporal graphs (see [`TemporalSnapshotsGNNGraph`](@ref)) by applying the convolution layers to each snapshot independently.
+
+| Layer | Sparse Ops | Edge Weight | Edge Features | Heterograph | TemporalSnapshotsGNNGraphs |
+| :-------- | :---: | :---: | :---: | :---: | :---: |
+| [`GCNConv`](@ref) | ✓ | ✓ | | ✓ | ✓ |
+
+## Docs
+
+```@autodocs
+Modules = [GNNLux]
+Pages = ["layers/conv.jl"]
+Private = false
+```
\ No newline at end of file
diff --git a/GNNLux/src/layers/basic.jl b/GNNLux/src/layers/basic.jl
index 6b4763459..ac28cabb7 100644
--- a/GNNLux/src/layers/basic.jl
+++ b/GNNLux/src/layers/basic.jl
@@ -2,14 +2,52 @@ abstract type GNNLayer <: AbstractLuxLayer end

An abstract type from which graph neural network layers are derived.

-It is Derived from Lux's `AbstractLuxLayer` type.
+It is derived from Lux's `AbstractLuxLayer` type.

-See also `GNNChain`.
+See also [`GNNLux.GNNChain`](@ref).
"""
abstract type GNNLayer <: AbstractLuxLayer end

abstract type GNNContainerLayer{T} <: AbstractLuxContainerLayer{T} end

+"""
+    GNNChain(layers...)
+    GNNChain(name = layer, ...)
+
+Collects multiple layers / functions to be called in sequence
+on a given input graph and input node features.
+
+It allows composing layers in a sequential fashion as `Lux.Chain`
+does, propagating the output of each layer to the next one.
+In addition, `GNNChain` handles the input graph as well, providing it
+as a first argument only to layers subtyping the [`GNNLayer`](@ref) abstract type.
+
+`GNNChain` supports indexing and slicing, `m[2]` or `m[1:end-1]`,
+and if names are given, `m[:name] == m[1]` etc.
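+
+For instance, a minimal sketch of the named form (the names `enc` and `act` here are illustrative):
+
+```julia
+m = GNNChain(enc = GCNConv(2 => 5),
+             act = x -> relu.(x))
+
+m[:enc] == m[1]    # access the first layer by name or by position
+```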
+
+# Examples
+
+```jldoctest
+julia> using Lux, GNNLux, Random
+
+julia> rng = Random.default_rng();
+
+julia> m = GNNChain(GCNConv(2=>5),
+                    x -> relu.(x),
+                    Dense(5=>4));
+
+julia> x = randn(rng, Float32, 2, 3);
+
+julia> g = rand_graph(rng, 3, 6)
+GNNGraph:
+  num_nodes: 3
+  num_edges: 6
+
+julia> ps, st = LuxCore.setup(rng, m);
+
+julia> m(g, x, ps, st) # First entry is the output, second entry is the state of the model
+(Float32[-0.15594329 -0.15594329 -0.15594329; 0.93431795 0.93431795 0.93431795; 0.27568763 0.27568763 0.27568763; 0.12568939 0.12568939 0.12568939], (layer_1 = NamedTuple(), layer_2 = NamedTuple(), layer_3 = NamedTuple()))
+```
+"""
@concrete struct GNNChain <: GNNContainerLayer{(:layers,)}
    layers <: NamedTuple
end
diff --git a/GNNLux/src/layers/conv.jl b/GNNLux/src/layers/conv.jl
index f0b51066b..3603852f5 100644
--- a/GNNLux/src/layers/conv.jl
+++ b/GNNLux/src/layers/conv.jl
@@ -5,7 +5,80 @@ _getstate(s::StatefulLuxLayer{Static.True}) = s.st
_getstate(s::StatefulLuxLayer{false}) = s.st_any
_getstate(s::StatefulLuxLayer{Static.False}) = s.st_any
-
+@doc raw"""
+    GCNConv(in => out, σ=identity; [init_weight, init_bias, use_bias, add_self_loops, use_edge_weight])
+
+Graph convolutional layer from the paper [Semi-supervised Classification with Graph Convolutional Networks](https://arxiv.org/abs/1609.02907).
+
+Performs the operation
+```math
+\mathbf{x}'_i = \sum_{j\in N(i)} a_{ij} W \mathbf{x}_j
+```
+where ``a_{ij} = 1 / \sqrt{|N(i)||N(j)|}`` is a normalization factor computed from the node degrees.
+
+If the input graph has weighted edges and `use_edge_weight=true`, then ``a_{ij}`` will be computed as
+```math
+a_{ij} = \frac{e_{j\to i}}{\sqrt{\sum_{j \in N(i)} e_{j\to i}} \sqrt{\sum_{i \in N(j)} e_{i\to j}}}
+```
+
+# Arguments
+
+- `in`: Number of input features.
+- `out`: Number of output features.
+- `σ`: Activation function. Default `identity`.
+- `init_weight`: Weights' initializer. Default `glorot_uniform`.
+- `init_bias`: Bias initializer. Default `zeros32`.
+- `use_bias`: Add learnable bias. Default `true`.
+- `add_self_loops`: Add self loops to the graph before performing the convolution. Default `true`.
+- `use_edge_weight`: If `true`, consider the edge weights in the input graph (if available).
+  If `add_self_loops=true` the new weights will be set to 1.
+  This option is ignored if the `edge_weight` is explicitly provided in the forward pass.
+  Default `false`.
+
+# Forward
+
+    (::GCNConv)(g, x, [edge_weight], ps, st; norm_fn = d -> 1 ./ sqrt.(d), conv_weight=nothing)
+
+Takes as input a graph `g`, a node feature matrix `x` of size `[in, num_nodes]`, optionally an edge weight vector, and the parameters and state of the layer. Returns a node feature matrix of size
+`[out, num_nodes]` together with the updated layer state.
+
+The `norm_fn` parameter allows for custom normalization of the graph convolution operation by passing a function as argument.
+By default, it computes ``\frac{1}{\sqrt{d}}``, i.e. the inverse square root of the degree (`d`) of each node in the graph.
+If `conv_weight` is an `AbstractMatrix` of size `[out, in]`, then the convolution is performed using that weight matrix.
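+
+For example, assuming the layer and data are set up as in the examples below, a hypothetical replacement weight matrix `W` could be passed as:
+
+```julia
+W = randn(Float32, 5, 3)                # custom weight matrix of size [out, in]
+y = l(g, x, ps, st; conv_weight = W)    # first entry is the output, second the updated state
+```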
+
+# Examples
+
+```julia
+using GNNLux, Lux, Random
+
+# initialize random number generator
+rng = Random.default_rng()
+
+# create data
+s = [1,1,2,3]
+t = [2,3,1,1]
+g = GNNGraph(s, t)
+x = randn(rng, Float32, 3, g.num_nodes)
+
+# create layer
+l = GCNConv(3 => 5)
+
+# setup layer
+ps, st = LuxCore.setup(rng, l)
+
+# forward pass
+y = l(g, x, ps, st)    # first entry of the output has size 5 × num_nodes
+
+# convolution with edge weights and custom normalization function
+w = [1.1, 0.1, 2.3, 0.5]
+custom_norm_fn(d) = 1 ./ sqrt.(d .+ 1)  # custom normalization function
+y = l(g, x, w, ps, st; norm_fn = custom_norm_fn)
+
+# Edge weights can also be embedded in the graph.
+g = GNNGraph(s, t, w)
+l = GCNConv(3 => 5, use_edge_weight=true)
+ps, st = LuxCore.setup(rng, l)
+y = l(g, x, ps, st)    # same as l(g, x, w, ps, st)
+```
+"""
@concrete struct GCNConv <: GNNLayer
    in_dims::Int
    out_dims::Int
@@ -18,7 +91,7 @@ end

function GCNConv(ch::Pair{Int, Int}, σ = identity;
-                 init_weight = glorot_uniform, 
+                 init_weight = glorot_uniform,
                 init_bias = zeros32,
                 use_bias::Bool = true,
                 add_self_loops::Bool = true,
@@ -55,7 +128,7 @@ end

function (l::GCNConv)(g, x, edge_weight, ps, st;
                      norm_fn = d -> 1 ./ sqrt.(d),
-                      conv_weight=nothing, )
+                      conv_weight=nothing)
    m = (; ps.weight, bias = _getbias(ps),
         l.add_self_loops, l.use_edge_weight, l.σ)
diff --git a/GNNlib/docs/src/index.md b/GNNlib/docs/src/index.md
index d1668b933..cc27aa266 100644
--- a/GNNlib/docs/src/index.md
+++ b/GNNlib/docs/src/index.md
@@ -1,6 +1,15 @@
# GNNlib.jl

GNNlib.jl is a package that provides the implementation of the basic message passing functions and
-functional implementation of graph convolutional layers, which are used to build graph neural networks in both the Flux.jl and Lux.jl machine learning frameworks, created in the GraphNeuralNetworks.jl and GNNLux.jl packages, respectively.
+functional implementation of graph convolutional layers, which are used to build graph neural networks in both the [Flux.jl](https://fluxml.ai/Flux.jl/stable/) and [Lux.jl](https://lux.csail.mit.edu/stable/) machine learning frameworks, through the GraphNeuralNetworks.jl and GNNLux.jl packages, respectively.

-This package depends on GNNGraphs.jl and NNlib.jl, and is primarily intended for developers looking to create new GNN architectures. For most users, the higher-level GraphNeuralNetworks.jl and GNNLux.jl packages are recommended.
\ No newline at end of file
+This package depends on GNNGraphs.jl and NNlib.jl, and is primarily intended for developers looking to create new GNN architectures. For most users, the higher-level GraphNeuralNetworks.jl and GNNLux.jl packages are recommended.
+
+## Installation
+
+The package can be installed with the Julia package manager.
+From the Julia REPL, type `]` to enter the Pkg REPL mode and run:
+
+```julia
+pkg> add GNNlib
+```
diff --git a/GNNlib/docs/src/messagepassing.md b/GNNlib/docs/src/messagepassing.md
index 954fb9dd2..776cc0200 100644
--- a/GNNlib/docs/src/messagepassing.md
+++ b/GNNlib/docs/src/messagepassing.md
@@ -134,7 +134,7 @@ function (l::GCN)(g::GNNGraph, x::AbstractMatrix{T}) where T
end
```

-See the `GATConv` implementation [here](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl) for a more complex example.
+See the `GATConv` implementation [here](https://juliagraphs.org/GraphNeuralNetworks.jl/graphneuralnetworks/api/conv/) for a more complex example.
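+
+As a minimal, self-contained sketch (the graph, the features, and the `scale_source` message function below are illustrative, not part of the library API):
+
+```julia
+using GNNGraphs, GNNlib
+
+g = rand_graph(5, 10)                      # 5 nodes, 10 edges
+x = rand(Float32, 4, g.num_nodes)          # node features of size [4, num_nodes]
+
+# message function: each edge carries the source node features scaled by 1/2
+scale_source(xi, xj, e) = 0.5f0 .* xj
+
+y = propagate(scale_source, g, +; xj = x)  # sum incoming messages at each node
+```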

## Built-in message functions

diff --git a/GNNlib/src/GNNlib.jl b/GNNlib/src/GNNlib.jl
index 3ca4acc9b..1941f5752 100644
--- a/GNNlib/src/GNNlib.jl
+++ b/GNNlib/src/GNNlib.jl
@@ -2,7 +2,7 @@ module GNNlib
using Statistics: mean
using LinearAlgebra, Random
-using MLUtils: zeros_like
+using MLUtils: zeros_like, ones_like
using NNlib
using NNlib: scatter, gather
using DataStructures: nlargest
diff --git a/GraphNeuralNetworks/docs/src/home.md b/GraphNeuralNetworks/docs/src/home.md
index 2ccebefd0..fa28621a7 100644
--- a/GraphNeuralNetworks/docs/src/home.md
+++ b/GraphNeuralNetworks/docs/src/home.md
@@ -1,8 +1,8 @@
# GraphNeuralNetworks

-This is the documentation page for [GraphNeuralNetworks.jl](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl), a graph neural network library written in Julia and based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl).
-GraphNeuralNetworks.jl is largely inspired by [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/), [Deep Graph Library](https://docs.dgl.ai/),
-and [GeometricFlux.jl](https://fluxml.ai/GeometricFlux.jl/stable/).
+GraphNeuralNetworks.jl is a graph neural network package based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl).
+
+It provides a set of stateful graph convolutional layers and utilities to build graph neural networks.

Among its features:

@@ -11,30 +11,30 @@ Among its features:
* Easy to define custom layers.
* CUDA support.
* Integration with [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
-* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/examples) of node, edge, and graph level machine learning tasks.
+* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) of node, edge, and graph level machine learning tasks.
+* Heterogeneous and temporal graphs.

## Package overview

-Let's give a brief overview of the package by solving a
-graph regression problem with synthetic data.
+Let's give a brief overview of the package by solving a graph regression problem with synthetic data.

-Usage examples on real datasets can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/examples) folder.
+Usage examples on real datasets can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) folder.

### Data preparation

We create a dataset consisting in multiple random graphs and associated data features.
```julia
-using GraphNeuralNetworks, Graphs, Flux, CUDA, Statistics, MLUtils
+using GraphNeuralNetworks, Flux, CUDA, Statistics, MLUtils
using Flux: DataLoader

all_graphs = GNNGraph[]

for _ in 1:1000
    g = rand_graph(10, 40,
-                   ndata=(; x = randn(Float32, 16,10)),  # input node features
-                   gdata=(; y = randn(Float32)))         # regression target
+                   ndata=(; x = randn(Float32, 16,10)),  # Input node features
+                   gdata=(; y = randn(Float32)))         # Regression target
    push!(all_graphs, g)
end
```
@@ -50,7 +50,7 @@ model = GNNChain(GCNConv(16 => 64),
                 BatchNorm(64),     # Apply batch normalization on node features (nodes dimension is batch dimension)
                 x -> relu.(x),
                 GCNConv(64 => 64, relu),
-                GlobalPool(mean),  # aggregate node-wise features into graph-wise features
+                GlobalPool(mean),  # Aggregate node-wise features into graph-wise features
                 Dense(64, 1)) |> device

opt = Flux.setup(Adam(1f-4), model)
diff --git a/GraphNeuralNetworks/docs/src/index.md b/GraphNeuralNetworks/docs/src/index.md
index a8a8b0f1d..692347aea 100644
--- a/GraphNeuralNetworks/docs/src/index.md
+++ b/GraphNeuralNetworks/docs/src/index.md
@@ -15,6 +15,49 @@ Here is a schema of the dependencies between the packages:

![Monorepo schema](assets/schema.png)

+Among its general features:
+
+* Implements common graph convolutional layers both in stateful and stateless form.
+* Supports computations on batched graphs.
+* Easy to define custom layers.
+* CUDA support.
+* Integration with [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
+* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) of node, edge, and graph level machine learning tasks.
+* Heterogeneous and temporal graphs.
+
+## Installation
+
+GraphNeuralNetworks.jl, GNNlib.jl, and GNNGraphs.jl are registered Julia packages. You can install any of them, for example GraphNeuralNetworks.jl, through the package manager:
+
+```julia
+pkg> add GraphNeuralNetworks
+```
+
+## Usage
+
+Usage examples can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) and in the [notebooks](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/notebooks) folders. Also, make sure to read the [documentation](https://juliagraphs.org/GraphNeuralNetworks.jl/graphneuralnetworks/) for a comprehensive introduction to the library and the [tutorials](https://juliagraphs.org/GraphNeuralNetworks.jl/tutorials/).
+
+## Citing
+
+If you use GraphNeuralNetworks.jl in a scientific publication, we would appreciate the following reference:
+
+```
+@misc{Lucibello2021GNN,
+    author = {Carlo Lucibello and other contributors},
+    title = {GraphNeuralNetworks.jl: a geometric deep learning library for the Julia programming language},
+    year = 2021,
+    url = {https://github.com/JuliaGraphs/GraphNeuralNetworks.jl}
+}
+```
+
+## Acknowledgments
+
+GraphNeuralNetworks.jl is largely inspired by [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/), [Deep Graph Library](https://docs.dgl.ai/),
+and [GeometricFlux.jl](https://fluxml.ai/GeometricFlux.jl/stable/).
+
diff --git a/README.md b/README.md
index a022117c3..acbccf28b 100644
--- a/README.md
+++ b/README.md
@@ -7,30 +7,41 @@
![](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/actions/workflows/ci.yml/badge.svg)
[![codecov](https://codecov.io/gh/JuliaGraphs/GraphNeuralNetworks.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/JuliaGraphs/GraphNeuralNetworks.jl)

+This is the monorepository for the GraphNeuralNetworks project, bringing together all code into a unified structure to facilitate code sharing and reusability across different project components. It contains the following packages:

-GraphNeuralNetworks.jl is a graph neural network library written in Julia and based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl).
+- `GraphNeuralNetworks.jl`: Package that contains stateful graph convolutional layers based on the machine learning framework [Flux.jl](https://fluxml.ai/Flux.jl/stable/). This is the front-end package for Flux users. It depends on the GNNlib.jl, GNNGraphs.jl, and Flux.jl packages.

-Among its features:
+- `GNNLux.jl`: Package that contains stateless graph convolutional layers based on the machine learning framework [Lux.jl](https://lux.csail.mit.edu/stable/). This is the front-end package for Lux users. It depends on the GNNlib.jl, GNNGraphs.jl, and Lux.jl packages.

-* Implements common graph convolutional layers.
+- `GNNlib.jl`: Package that contains the core graph neural network layers and utilities. It depends on the GNNGraphs.jl and NNlib.jl packages and serves as the code base for the GraphNeuralNetworks.jl and GNNLux.jl packages.
+
+- `GNNGraphs.jl`: Package that contains the graph data structures and helper functions for working with graph data. It depends on the Graphs.jl package.
+
+Among its general features:
+
+* Implements common graph convolutional layers both in stateful and stateless form.
* Supports computations on batched graphs.
* Easy to define custom layers.
* CUDA support.
* Integration with [Graphs.jl](https://github.com/JuliaGraphs/Graphs.jl).
-* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/examples) of node, edge, and graph level machine learning tasks.
+* [Examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) of node, edge, and graph level machine learning tasks.
+* Heterogeneous and temporal graphs.

## Installation

-GraphNeuralNetworks.jl is a registered Julia package. You can easily install it through the package manager:
+GraphNeuralNetworks.jl, GNNlib.jl, and GNNGraphs.jl are registered Julia packages. You can install any of them, for example GraphNeuralNetworks.jl, through the package manager:

```julia
pkg> add GraphNeuralNetworks
```
+
## Usage

-Usage examples can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/examples) and in the [notebooks](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/notebooks) folder. Also, make sure to read the [documentation](https://JuliaGraphs.github.io/GraphNeuralNetworks.jl/dev) for a comprehensive introduction to the library.
+Usage examples can be found in the [examples](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/examples) and in the [notebooks](https://github.com/JuliaGraphs/GraphNeuralNetworks.jl/tree/master/GraphNeuralNetworks/notebooks) folders. Also, make sure to read the [documentation](https://juliagraphs.org/GraphNeuralNetworks.jl/graphneuralnetworks/) for a comprehensive introduction to the library and the [tutorials](https://juliagraphs.org/GraphNeuralNetworks.jl/tutorials/).

## Citing