Interesting suggestion. Do you have a specific use case where first-order methods don't work, for example? BackPACK certainly provides the functionality to implement such an optimizer, and I would be happy to help out in case of issues.
Feature
Implementation of a Levenberg-Marquardt optimizer.
Motivation
In MATLAB, the feedforwardnet supports many optimizers, some of which are not implemented in PyTorch, such as trainlm (Levenberg-Marquardt). In tests by our group and others, Levenberg-Marquardt performs much better and needs fewer epochs to converge than first-order optimizers (Adam, SGD with momentum) on regression models. Several issues on the PyTorch issue tracker have asked for an implementation of the Levenberg-Marquardt algorithm, but there has been no further response. It would be helpful to implement the Levenberg-Marquardt optimizer, and I am willing to help test.
Addition
Here is some helpful information and code.
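To make the requested algorithm concrete, here is a minimal sketch of the Levenberg-Marquardt update for a one-parameter least-squares fit in pure Python. The model, data, and damping schedule are illustrative assumptions, not part of any existing PyTorch or MATLAB API; a real optimizer would solve the damped normal equations (JᵀJ + λI)δ = −Jᵀr over all network parameters.

```python
def levenberg_marquardt(xs, ys, a=0.0, lam=1e-3, iters=50):
    """Fit y = a * x by minimizing the sum of squared residuals.

    Illustrative sketch: for a single parameter, J^T J and J^T r are
    scalars, so the damped Gauss-Newton step needs no matrix solve.
    """
    def loss(a):
        return sum((y - a * x) ** 2 for x, y in zip(xs, ys))

    for _ in range(iters):
        # Residuals r_i = y_i - a * x_i have Jacobian dr_i/da = -x_i.
        jtj = sum(x * x for x in xs)                          # J^T J
        jtr = sum(-x * (y - a * x) for x, y in zip(xs, ys))   # J^T r
        # Damped Gauss-Newton step: (J^T J + lam) * delta = -J^T r.
        delta = -jtr / (jtj + lam)
        if loss(a + delta) < loss(a):
            a += delta                 # accept: trust the quadratic model more
            lam = max(lam / 10, 1e-12)
        else:
            lam *= 10                  # reject: increase damping toward gradient descent
    return a

# Noise-free data from y = 3x; the fit recovers a ≈ 3.
print(levenberg_marquardt([1, 2, 3, 4], [3, 6, 9, 12]))
```

The damping parameter λ is what interpolates between gradient descent (large λ) and Gauss-Newton (small λ); the accept/reject schedule shown here is one common heuristic, and trainlm uses a similar multiplicative adaptation.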