Installation • Documentation • Quickstart • Features • Background • Motivation • Citation
Exponax is a suite for building Fourier spectral ETDRK time-steppers for semi-linear PDEs in 1d, 2d, and 3d. There are many pre-built dynamics and plenty of helpful utilities. It is extremely efficient, is differentiable (due to being fully written in JAX), and embeds seamlessly into deep learning.
```bash
pip install exponax
```
Requires Python 3.10+ and JAX 0.4.13+. 👉 JAX install guide.
Documentation is available at fkoehler.site/exponax.
1d Kuramoto-Sivashinsky Equation.
```python
import jax
import exponax as ex
import matplotlib.pyplot as plt

# Instantiate a time stepper for the conservative form of the KS equation
ks_stepper = ex.stepper.KuramotoSivashinskyConservative(
    num_spatial_dims=1, domain_extent=100.0,
    num_points=200, dt=0.1,
)

# Draw a random initial condition as a truncated Fourier series
u_0 = ex.ic.RandomTruncatedFourierSeries(
    num_spatial_dims=1, cutoff=5
)(num_points=200, key=jax.random.PRNGKey(0))

# Autoregressively roll out 500 steps -> shape (501, 1, 200)
trajectory = ex.rollout(ks_stepper, 500, include_init=True)(u_0)

plt.imshow(trajectory[:, 0, :].T, aspect='auto', cmap='RdBu', vmin=-2, vmax=2, origin="lower")
plt.xlabel("Time"); plt.ylabel("Space"); plt.show()
```
For a next step, check out this tutorial on 1D Advection that explains the basics of Exponax.
- JAX as the computational backend:
    - Backend agnostic code - run on CPU, GPU, or TPU, in both single and double precision.
    - Automatic differentiation over the timesteppers - compute gradients of solutions with respect to initial conditions, parameters, etc. (see the sketch after this list).
    - Also helpful for tight integration with Deep Learning since each timestepper is just an Equinox Module.
    - Automatic Vectorization using `jax.vmap` (or `equinox.filter_vmap`), allowing you to advance multiple states in time or instantiate multiple solvers at a time that operate efficiently in batch.
- Lightweight Design without custom types. There is no `grid` or `state` object. Everything is based on JAX arrays. Timesteppers are callable PyTrees.
- More than 46 pre-built dynamics across 1d, 2d, and 3d:
    - Linear PDEs (advection, diffusion, dispersion, etc.)
    - Nonlinear PDEs (Burgers, Kuramoto-Sivashinsky, Korteweg-de Vries, Navier-Stokes, etc.)
    - Reaction-Diffusion (Gray-Scott, Swift-Hohenberg, etc.)
- Collection of initial condition distributions (truncated Fourier series, Gaussian Random Fields, etc.)
- Utilities for spectral derivatives, grid creation, autoregressive rollout, interpolation, etc.
- Easily extendable to new PDEs by subclassing from the `BaseStepper` module.
- An alternative, reduced interface allowing you to define PDE dynamics using normalized or difficulty-based identifiers.
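As a minimal sketch of differentiating through a rollout (reusing the 1d KS setup from the quickstart above; the 50-step horizon and the mean-square loss are arbitrary choices for illustration, not part of the library):

```python
import jax
import jax.numpy as jnp
import exponax as ex

stepper = ex.stepper.KuramotoSivashinskyConservative(
    num_spatial_dims=1, domain_extent=100.0, num_points=200, dt=0.1,
)

def loss_fn(u_0):
    # 50 autoregressive steps, then an arbitrary scalar loss on the final state
    trj = ex.rollout(stepper, 50)(u_0)
    return jnp.mean(trj[-1] ** 2)

u_0 = ex.ic.RandomTruncatedFourierSeries(
    num_spatial_dims=1, cutoff=5
)(num_points=200, key=jax.random.PRNGKey(0))

# Reverse-mode gradient of the loss with respect to the initial condition
grad_u_0 = jax.grad(loss_fn)(u_0)
```

Because the stepper is a callable PyTree, the same pattern also works for gradients with respect to the stepper's own parameters.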
Exponax supports the efficient solution of (semi-linear) partial differential equations on periodic domains in arbitrary dimensions. Those are PDEs of the form

$$ \frac{\partial u}{\partial t} = \mathcal{L} u + \mathcal{N}(u), $$

where $\mathcal{L}$ is a linear differential operator and $\mathcal{N}$ is a nonlinear differential operator.
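To make the split concrete (for orientation only, not the exact discretization details used internally): the 1d viscous Burgers equation, one of the pre-built dynamics, decomposes as

$$ \frac{\partial u}{\partial t} = \underbrace{\nu \frac{\partial^2 u}{\partial x^2}}_{\mathcal{L} u} \underbrace{-\, u \frac{\partial u}{\partial x}}_{\mathcal{N}(u)}. $$

In Fourier space, $\mathcal{L}$ acts diagonally with one eigenvalue $\lambda_k$ per wavenumber $k$, so ETDRK methods [1] integrate the linear part exactly and approximate only the nonlinearity; the first-order member of the family reads

$$ \hat{u}_{n+1,k} = e^{\lambda_k \Delta t} \, \hat{u}_{n,k} + \frac{e^{\lambda_k \Delta t} - 1}{\lambda_k} \, \widehat{\mathcal{N}(u_n)}_k. $$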
We focus on periodic domains on scaled hypercubes with a uniform Cartesian discretization. This allows using the Fast Fourier Transform resulting in blazing fast simulations. For example, a dataset of trajectories for the 2d Kuramoto-Sivashinsky equation with 50 initial conditions over 200 time steps with a 128x128 discretization is created in less than a second on a modern GPU.
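A sketch of how such a batched dataset could be produced with `jax.vmap` (the 2d stepper name `ex.stepper.KuramotoSivashinsky` follows the naming of the 1d quickstart; the domain size and time step here are illustrative assumptions):

```python
import jax
import exponax as ex

# 2d Kuramoto-Sivashinsky stepper; domain size and dt are illustrative guesses
ks_2d = ex.stepper.KuramotoSivashinsky(
    num_spatial_dims=2, domain_extent=30.0,
    num_points=128, dt=0.1,
)

# Draw 50 random initial conditions, one PRNG key each
ic_gen = ex.ic.RandomTruncatedFourierSeries(num_spatial_dims=2, cutoff=5)
keys = jax.random.split(jax.random.PRNGKey(0), 50)
u_0_set = jax.vmap(lambda k: ic_gen(num_points=128, key=k))(keys)

# Batched rollout: trajectories has shape (50, 201, 1, 128, 128)
trajectories = jax.vmap(ex.rollout(ks_2d, 200, include_init=True))(u_0_set)
```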
[1] Cox, Steven M., and Paul C. Matthews. "Exponential time differencing for stiff systems." Journal of Computational Physics 176.2 (2002): 430-455.
[2] Kassam, Aly-Khan, and Lloyd N. Trefethen. "Fourth-order time-stepping for stiff PDEs." SIAM Journal on Scientific Computing 26.4 (2005): 1214-1233.
[3] Montanelli, Hadrien, and Niall Bootland. "Solving periodic semilinear stiff PDEs in 1D, 2D and 3D with exponential integrators." Mathematics and Computers in Simulation 178 (2020): 307-327.
This package is greatly inspired by the chebfun library in MATLAB, in particular the spinX (Stiff PDE INtegrator in X dimensions) module within it. These MATLAB utilities have been used extensively as data generators in early works for supervised physics-informed ML, e.g., the DeepHiddenPhysics and Fourier Neural Operators (the links show where in their public repos they use the spinX module). The approach of pre-sampling the solvers, writing out the trajectories, and then using them for supervised training worked for these problems, but of course limits the scope to purely supervised problems. Modern research ideas like correcting coarse solvers (see for instance the Solver-in-the-Loop paper or the ML-accelerated CFD paper) require the coarse solver to be differentiable. Some ideas of diverted chain training also require the fine solver to be differentiable. Even for applications without differentiable solvers, we still have the interface problem with legacy solvers (like the MATLAB ones). Hence, we cannot easily query them "on-the-fly" for something like active learning tasks, nor do they run efficiently on hardware accelerators (GPUs, TPUs, etc.). Additionally, they were not designed with batch execution (in the sense of vectorized application) in mind, which we get more or less for free via `jax.vmap`. With the reproducible randomness of JAX we might not even have to ever write out a dataset and can re-create it in seconds!
This package also took much inspiration from FourierFlows.jl in the Julia ecosystem, especially for checking the implementation of the contour integral method of [2] and how to handle (de)aliasing.
This package was developed as part of the APEBench paper (arxiv.org/abs/2411.00180, accepted at NeurIPS 2024). If you find it useful for your research, please consider citing it:
```bibtex
@article{koehler2024apebench,
  title={{APEBench}: A Benchmark for Autoregressive Neural Emulators of {PDE}s},
  author={Felix Koehler and Simon Niedermayr and R{\"u}diger Westermann and Nils Thuerey},
  journal={Advances in Neural Information Processing Systems (NeurIPS)},
  volume={38},
  year={2024}
}
```
(Feel free to also give the project a star on GitHub if you like it.)
Here you can find the APEBench benchmark suite.
The main author (Felix Koehler) is a PhD student in the group of Prof. Thuerey at TUM and his research is funded by the Munich Center for Machine Learning.
MIT, see here
fkoehler.site · GitHub @ceyron · X @felix_m_koehler · LinkedIn Felix Köhler