Folax: Solution and Optimization of Parameterized PDEs

Finite Operator Learning (FOL) with JAX is a unified numerical framework that integrates established numerical methods with scientific machine learning techniques for solving and optimizing parameterized partial differential equations (PDEs). To construct a physics-informed operator learning approach, FOL formulates a purely physics-based loss function derived from the Method of Weighted Residuals, so that discrete residuals computed with classical PDE solution techniques are incorporated directly into backpropagation during network training. This ensures that the learned operators rigorously satisfy the underlying governing equations while remaining consistent with established numerical discretizations. Importantly, the loss formulation is agnostic to the network architecture and has been applied successfully to Conditional Neural Fields, Fourier Neural Operators (FNO), and DeepONets.
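To illustrate the idea, here is a minimal, hypothetical sketch of such a physics-based loss in JAX: a 1D Poisson problem -u'' = f is discretized with central finite differences, and the squared norm of the discrete residual serves as the training loss. The names `apply_fn` and `params` are illustrative stand-ins for any network; this is not Folax's actual API.

```python
import jax
import jax.numpy as jnp

def poisson_residual(u, f, dx):
    # Discrete residual of -u'' = f at interior nodes,
    # using second-order central finite differences.
    lap = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
    return -lap - f[1:-1]

def physics_loss(params, apply_fn, x, f, dx):
    # Predict the nodal solution with any network (neural field,
    # FNO, DeepONet, ...) and enforce homogeneous Dirichlet BCs.
    u = apply_fn(params, x)
    u = u.at[0].set(0.0).at[-1].set(0.0)
    r = poisson_residual(u, f, dx)
    # Squared residual norm: gradients of this purely physics-based
    # loss flow through the discretization into the network weights.
    return jnp.sum(r**2)

# The loss is differentiable end to end, so standard JAX autodiff applies:
grad_fn = jax.grad(physics_loss)
```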

FOL has been applied in the following scientific studies:

  • A Physics-Informed Meta-Learning Framework for the Continuous Solution of Parametric PDEs on Arbitrary Geometries [arXiv].
  • Finite Operator Learning: Bridging Neural Operators and Numerical Methods for Efficient Parametric Solution and Optimization of PDEs [arXiv].
  • Digitalizing metallic materials from image segmentation to multiscale solutions via physics informed operator learning [npj Computational Materials].
  • A Finite Operator Learning Technique for Mapping the Elastic Properties of Microstructures to Their Mechanical Deformations [Numerical Methods in Eng.].
  • SPiFOL: A Spectral-based physics-informed finite operator learning for prediction of mechanical behavior of microstructures [J. Mechanics and Physics of Solids].

Folax builds on several widely adopted Python packages: JAX for high-performance array computation on CPUs and GPUs, PETSc for the efficient solution of large-scale linear systems, METIS for mesh partitioning (integration forthcoming), Flax for constructing modular and flexible neural networks, Optax for state-of-the-art gradient-based optimization algorithms, and Orbax for efficient checkpointing and serialization. This foundation provides scalability, computational efficiency, and ease of use in large-scale training and simulation workflows.
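As a rough sketch of how these pieces fit together (illustrative only, not Folax's actual API): a small Flax model trained with an Optax optimizer inside a jitted JAX training step.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax

class MLP(nn.Module):
    # Minimal Flax network; Folax's real architectures differ.
    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(64)(x))
        return nn.Dense(1)(x)

model = MLP()
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8)))
tx = optax.adam(1e-3)
opt_state = tx.init(params)

@jax.jit
def train_step(params, opt_state, x, y):
    def loss_fn(p):
        pred = model.apply(p, x)
        return jnp.mean((pred - y) ** 2)
    # JAX computes the loss and its gradient; Optax applies the update.
    loss, grads = jax.value_and_grad(loss_fn)(params)
    updates, opt_state = tx.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state, loss
```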

Installation

CPU installation

To install folax for CPU usage using pip (recommended), run the following command:

pip install folax[cpu]

GPU installation

To install folax for GPU usage using pip (recommended), run the following command:

pip install folax[cuda]
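After a GPU installation, you can verify that JAX detects the accelerator using the standard JAX API:

```python
import jax

# Should list GPU devices, e.g. [CudaDevice(id=0)]; if the CUDA
# installation failed, JAX silently falls back to the CPU backend.
print(jax.devices())
```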

Developer installation

If you would like to do development in folax, first clone the repository and, from the folax folder, run the following command:

pip install -e .[cuda,dev]

Contributing

If you would like to contribute to the project, please open a pull request for small changes. For larger changes to the source code, please open an issue or discussion first so we can start a conversation.
