ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
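The key ingredient in AdaHessian-style preconditioning is a stochastic estimate of the Hessian diagonal via Hutchinson's method, diag(H) ≈ E[z ⊙ (Hz)] with Rademacher probe vectors z. Below is a minimal sketch using plain PyTorch autograd, not the repository's own API; the function name and arguments are illustrative.

```python
import torch

def hutchinson_diag_hessian(loss, params, n_samples=1):
    """Estimate diag(H) as E[z * (Hz)] with Rademacher probes z (Hutchinson's method)."""
    # First-order gradients with a graph so we can differentiate through them again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    diag = [torch.zeros_like(p) for p in params]
    for _ in range(n_samples):
        zs = [torch.randint_like(p, 2) * 2 - 1 for p in params]  # entries are +/-1
        # Hessian-vector products: d(g . z)/dtheta = H z
        hvps = torch.autograd.grad(grads, params, grad_outputs=zs, retain_graph=True)
        for d, z, hz in zip(diag, zs, hvps):
            d.add_(z * hz / n_samples)
    return diag
```

In an AdaHessian-style optimizer, such per-parameter diagonal estimates are typically smoothed with an exponential moving average, much like Adam's second moment, and then used to rescale the gradient step.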
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, a low-rank approximation preconditioner, and more)
Distributed K-FAC Preconditioner for PyTorch
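For reference, the Kronecker-factored approximation behind K-FAC treats the curvature block of a linear layer as A ⊗ S, where A is the covariance of the layer inputs and S the covariance of the back-propagated output gradients, so preconditioning reduces to two small linear solves. A minimal single-layer, single-process sketch follows; the function and argument names are illustrative and this is not the distributed implementation's API.

```python
import torch

def kfac_precondition(grad_W, acts, grad_out, damping=1e-3):
    """Precondition dL/dW (shape [out, in]) for one linear layer.

    Curvature block ~ A ⊗ S with A = E[a a^T] over inputs a ([batch, in])
    and S = E[g g^T] over output gradients g ([batch, out]); the
    preconditioned gradient is S^{-1} (dL/dW) A^{-1}.
    """
    n = acts.shape[0]
    eye = lambda k, ref: torch.eye(k, dtype=ref.dtype, device=ref.device)
    A = acts.t() @ acts / n + damping * eye(acts.shape[1], acts)
    S = grad_out.t() @ grad_out / n + damping * eye(grad_out.shape[1], grad_out)
    right = torch.linalg.solve(A, grad_W.t()).t()   # (dL/dW) A^{-1}
    return torch.linalg.solve(S, right)             # S^{-1} (dL/dW) A^{-1}
```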
An implementation of the PSGD Kron second-order optimizer for PyTorch
FEDL: a Federated Learning algorithm implemented in TensorFlow (Transactions on Networking, 2021)
This repository implements FEDL using PyTorch
PyTorch implementation of the Hessian-free optimizer
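Hessian-free (truncated Newton) methods never form the Hessian explicitly; they only need Hessian-vector products, obtained by differentiating the gradient a second time, and approximately solve H p = -g with a few conjugate-gradient iterations. A rough sketch of both pieces in PyTorch, with illustrative names and damping, under the assumption of list-of-tensor parameters:

```python
import torch

def make_hvp(loss, params, damping=1e-2):
    """Return v -> (H + damping*I) v via double backprop (Pearlmutter's trick)."""
    grads = torch.autograd.grad(loss, params, create_graph=True)
    def hvp(vecs):
        hv = torch.autograd.grad(grads, params, grad_outputs=vecs, retain_graph=True)
        return [h + damping * v for h, v in zip(hv, vecs)]
    return hvp

def conjugate_gradient(hvp, b, iters=10):
    """Approximately solve (H + damping*I) x = b with plain CG over parameter lists."""
    dot = lambda u, v: sum((ui * vi).sum() for ui, vi in zip(u, v))
    x = [torch.zeros_like(t) for t in b]
    r = [t.clone() for t in b]          # residual b - A x, with x = 0
    p = [t.clone() for t in b]
    rs = dot(r, r)
    for _ in range(iters):
        Ap = hvp(p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

A Newton-like step then solves against the negative gradient and applies the result with damping or a line search; practical implementations add preconditioning and early termination of CG.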
Tensorflow implementation of preconditioned stochastic gradient descent
Hessian-based stochastic optimization in TensorFlow and Keras
Implementation of PSGD optimizer in JAX
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
FOSI: a library for improving first-order optimizers with second-order information
Newton’s second-order optimization methods in Python
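For quick reference, the plain Newton step solves H(x) Δ = ∇f(x) and updates x ← x − Δ. A tiny NumPy sketch; the quadratic example and function names are made up for illustration only:

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize a smooth function by Newton's method: x <- x - H(x)^{-1} grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(hess(x), g)  # solve H step = g instead of inverting H
        x = x - step
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10 * (y + 2)^2
grad = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_minimize(grad, hess, x0=[0.0, 0.0]))  # converges to ~[1, -2] in one step
```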
The repository contains code to reproduce the experiments from our paper "Error Feedback Can Accurately Compress Preconditioners".
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning"
Modular optimization library for PyTorch.
Stochastic Second-Order Methods in JAX
NG+: A new second-order optimizer for deep learning
Sophia optimizer, further projected towards flat areas of the loss landscape