Torchdyn is a PyTorch library dedicated to numerical deep learning: differential equations, integral transforms, numerical methods. Maintained by DiffEqML.
Torchdyn provides utilities and layers to easily construct numerical deep learning models. For example, neural differential equations:
```python
import torch.nn as nn
from torchdyn.core import NeuralODE

# your preferred torch.nn.Module here; the vector field must map the state
# to a tensor of the same shape, hence padding=1 on the 3x3 convolutions
f = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1),
                  nn.Softplus(),
                  nn.Conv2d(32, 1, 3, padding=1))

nde = NeuralODE(f)
```
And you have a trainable model. Feel free to combine Torchdyn classes with any PyTorch modules to build composite models. We offer additional tools to build custom neural differential equation and implicit models, including a functional API for numerical methods. There is much more to Torchdyn than the `NeuralODE` and `NeuralSDE` classes: tutorials, a functional API to a variety of GPU-compatible numerical methods, and benchmarks.
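As a minimal sketch of both usages (assuming the torchdyn >= 1.0 API, where `NeuralODE.forward` takes an initial condition and a `t_span`, and `torchdyn.numerics.odeint` integrates any callable `f(t, x)`; the vector field, batch size and time grid below are purely illustrative):

```python
import torch
import torch.nn as nn
from torchdyn.core import NeuralODE
from torchdyn.numerics import odeint

# toy vector field on a flat 2-dimensional state (illustrative only)
f = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

model = NeuralODE(f, solver='dopri5', atol=1e-4, rtol=1e-4)
x0 = torch.randn(128, 2)              # batch of initial conditions
t_span = torch.linspace(0., 1., 20)   # times at which to evaluate the solution

# the NeuralODE module integrates the ODE defined by f along t_span
t_eval, traj = model(x0, t_span)

# the functional API integrates any callable f(t, x) without building a module
t_eval, sol = odeint(lambda t, x: f(x), x0, t_span, solver='tsit5')
```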
Contribute to the library with your benchmarks, tasks and numerical deep learning utilities! No need to reinvent the wheel :)
Stable release:

```
pip install torchdyn
```
Alternatively, you can build a virtual dev environment for `torchdyn` with `poetry`, following the steps outlined in Contributing.
Check our docs for more information.
Interest in the blend of differential equations, deep learning and dynamical systems has been reignited by recent works [1, 2, 3, 4]. Modern deep learning frameworks such as PyTorch, coupled with further improvements in computational resources, have allowed the continuous version of neural networks, with proposals dating back to the 80s [5], to finally come to life and provide a novel perspective on classical machine learning problems.
We explore how differentiable programming can unlock the effectiveness of deep learning to accelerate progress across scientific domains, including control, fluid dynamics and, in general, prediction of complex dynamical systems. Conversely, we focus on models powered by numerical methods and signal processing to advance the state of AI in classical domains such as vision and natural language.
By providing a centralized, easy-to-access collection of model templates, tutorials and application notebooks, we hope to speed up research in this area and ultimately establish neural differential equations and implicit models as an effective tool for control, system identification and general machine learning tasks.
`torchdyn` leverages modern PyTorch best practices and handles training with `pytorch-lightning` [6]. We build Graph Neural ODEs utilizing the Graph Neural Networks (GNNs) API of `dgl` [7]. For a complete list of references, check `pyproject.toml`. We offer a complete suite of ODE solvers and sensitivity methods, extending the functionality offered by `torchdiffeq` [1]. We have light dependencies on `torchsde` [7] and `torchcde` [8].
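As a rough sketch of how a solver and a sensitivity algorithm can be selected (assuming the `NeuralODE` constructor exposes `solver`, `sensitivity`, `atol` and `rtol` keyword arguments, as in recent torchdyn releases; the vector field is again illustrative):

```python
import torch.nn as nn
from torchdyn.core import NeuralODE

f = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

# adjoint sensitivity computes gradients with a backward-in-time solve,
# trading extra compute for memory; 'autograd' instead backpropagates
# through the solver steps
model = NeuralODE(f, solver='dopri5', sensitivity='adjoint',
                  atol=1e-4, rtol=1e-4)
```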
`torchdyn` contains a variety of self-contained quickstart examples / tutorials built for practitioners and researchers. Refer to the tutorial readme.
`torchdyn` is designed to be a community effort: we welcome all contributions of tutorials, model variants, numerical methods and applications related to continuous and implicit deep learning. We do not have specific style requirements, though we subscribe to many of Jeremy Howard's ideas.
We use `poetry` to manage requirements, virtual python environment creation, and packaging. To install `poetry`, refer to the docs. To set up your dev environment, run `poetry install`. For example, `poetry run pytest` will then run all `torchdyn` tests inside your newly created env.
`poetry` does not currently offer a way to select `torch` wheels based on the desired `cuda` version and OS, and will install a version without GPU support. For CUDA `torch` wheels, run `poetry run poe autoinstall-torch-cuda`, which will automatically install PyTorch based on your CUDA configuration.
If you wish to run `jupyter` notebooks within your newly created poetry environment, use `poetry run ipython kernel install --user --name=torchdyn` and switch the notebook kernel.
Choosing what to work on: There is always ongoing work on new features, tests and tutorials. If you wish to work on additional features not currently WIP, feel free to reach out on Slack or via email. We'll be glad to discuss details.
If you find Torchdyn valuable for your research or applied projects, please consider citing it:
```
@article{politorchdyn,
  title={TorchDyn: Implicit Models and Neural Numerical Methods in PyTorch},
  author={Poli, Michael and Massaroli, Stefano and Yamashita, Atsushi and Asama, Hajime and Park, Jinkyoo and Ermon, Stefano}
}
```