
Commit

Merge branch 'master' into master
rusty1s authored Jun 23, 2022
2 parents 12387d0 + 97c50a0 commit 4a0fdc5
Showing 2 changed files with 10 additions and 2 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -6,6 +6,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ## [2.0.5] - 2022-MM-DD
 ### Added
 - Added a `normalize` parameter to `dense_diff_pool` ([#4847](https://github.com/pyg-team/pytorch_geometric/pull/4847))
+- Added `size=None` explanation to jittable `MessagePassing` modules in the documentation ([#4850](https://github.com/pyg-team/pytorch_geometric/pull/4850))
 - Added documentation to the `DataLoaderIterator` class ([#4838](https://github.com/pyg-team/pytorch_geometric/pull/4838))
 - Added `GraphStore` support to `Data` and `HeteroData` ([#4816](https://github.com/pyg-team/pytorch_geometric/pull/4816))
 - Added `FeatureStore` support to `Data` and `HeteroData` ([#4807](https://github.com/pyg-team/pytorch_geometric/pull/4807))
11 changes: 9 additions & 2 deletions docs/source/notes/jit.rst
@@ -99,7 +99,8 @@ However, if you want your own GNN module to be jittable, you need to account for
     def forward(self, x: Tensor, edge_index: Tensor,
                 edge_weight: Optional[Tensor]) -> Tensor:
-        return self.propagate(edge_index, x=x, edge_weight=edge_weight)
+        return self.propagate(edge_index, x=x, edge_weight=edge_weight,
+                              size=None)
2. Declaring the type of propagation arguments as a comment anywhere inside your module:

@@ -115,4 +116,10 @@ However, if you want your own GNN module to be jittable, you need to account for
                 edge_weight: Optional[Tensor]) -> Tensor:
         # propagate_type: (x: Tensor, edge_weight: Optional[Tensor])
-        return self.propagate(edge_index, x=x, edge_weight=edge_weight)
+        return self.propagate(edge_index, x=x, edge_weight=edge_weight,
+                              size=None)
+
+.. warning::
+    Importantly, due to TorchScript limitations, one also has to pass the :obj:`size` argument to :meth:`~torch_geometric.nn.conv.message_passing.MessagePassing.propagate`.
+    In most cases, this can simply be set to :obj:`None`, in which case it will be inferred automatically.
