This is the official repository for the paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors" (ICML 2024).
Updated Jul 1, 2024 · Python
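The paper's titular claim is that LoRA-style low-rank updates behave like random-projection compression of the gradients. The snippet below is only a rough, self-contained illustration of random-projection gradient compression in JAX, not Flora's actual algorithm; the `compress`/`decompress` helpers and the rank value are hypothetical names chosen for the sketch.

```python
import jax
import jax.numpy as jnp

def compress(grad, key, rank=8):
    """Down-project an (m, n) gradient to an (m, rank) sketch with a random matrix."""
    n = grad.shape[1]
    # Entries scaled so that E[proj @ proj.T] = I_n, making reconstruction unbiased.
    proj = jax.random.normal(key, (n, rank)) / jnp.sqrt(rank)
    return grad @ proj, proj

def decompress(sketch, proj):
    """Map the sketch back to the original gradient shape (unbiased in expectation)."""
    return sketch @ proj.T

key = jax.random.PRNGKey(0)
grad = jax.random.normal(jax.random.PRNGKey(1), (256, 512))
sketch, proj = compress(grad, key, rank=64)
approx = decompress(sketch, proj)
print(sketch.shape, approx.shape)  # (256, 64) (256, 512)
```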
Implementation of the PSGD optimizer in JAX
JAX implementations of various deep reinforcement learning algorithms.
Tensor Networks for Machine Learning
JAX/Flax implementation of finite-size scaling
Goal-conditioned reinforcement learning like 🔥
Training methodologies for autoregressive neural operators/emulators in JAX.
An implementation of the Adan optimizer for Optax
A simplistic trainer for Flax
H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for a glyph. Its decoder is composed of multiple independent decoders that act on the code vector to reconstruct a point cloud representing the glyph.
JAX implementation of "Classical and Quantum Algorithms for Orthogonal Neural Networks" (Kerenidis et al., 2021)
Variational Graph Autoencoder implemented using Jax & Jraph
An Optax-based JAX implementation of the IVON optimizer for large-scale VI training of NNs (ICML'24 spotlight)
dm-haiku implementation of hyperbolic neural networks
Direct port of TD3_BC to JAX using Haiku and optax.
The (unofficial) vanilla version of WaveRNN
Code for Celo: Training Versatile Learned Optimizers on a Compute Diet
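Several of the entries above (the Adan, IVON, and PSGD implementations) are custom Optax gradient transformations. The sketch below shows the standard training loop such an optimizer plugs into; `optax.adam` is used only as a stand-in, since the import paths for the custom optimizers vary by repository.

```python
import jax
import jax.numpy as jnp
import optax

# Any Optax GradientTransformation (Adan, IVON, PSGD, ...) exposes init/update;
# optax.adam stands in for whichever optimizer you install.
optimizer = optax.adam(learning_rate=1e-3)

params = {"w": jnp.zeros((3,)), "b": jnp.zeros(())}
opt_state = optimizer.init(params)

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit
def train_step(params, opt_state, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state

x = jnp.ones((8, 3))
y = jnp.ones((8,))
params, opt_state = train_step(params, opt_state, x, y)
```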