
Releases: HesitantlyHuman/autoclip

AutoClip Patch v0.2.1

29 Sep 02:25

AutoClip v0.2.1 Patch Notes

Patch v0.2.1 fixes minor issues concerning the optimizer wrapping behaviour.

  • Fixed an error that occurred when pickling OptimizerWithClipping instances created through the PyTorch wrapper API.
  • Added support for torch hooks on the OptimizerWithClipping class, allowing more seamless integration into existing code.
  • Added support for returning the loss from an OptimizerWithClipping step when using a closure (see the sketch below).
  • Minor updates to the docstrings of the OptimizerWithClipping class.
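
In particular, a wrapped optimizer's step should now behave like a plain torch optimizer when handed a closure. A minimal sketch follows; the wrapping call itself is elided, since it depends on how the clipper is constructed:

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# Here `optimizer` stands in for an OptimizerWithClipping instance;
# wrap it however your code constructs the clipper.

data, target = torch.randn(32, 10), torch.randn(32, 1)

def closure():
    # Standard PyTorch closure: clear gradients, compute and
    # backpropagate the loss, then return it.
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(data), target)
    loss.backward()
    return loss

# As of v0.2.1, step() forwards the closure's return value, so the
# loss can be read back exactly as with an unwrapped optimizer.
loss = optimizer.step(closure)
```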

AutoClip 0.2: Optimizer Wrappers, Pytests and Parameter Checking

16 Jul 05:59

AutoClip 0.2: Release Notes

Only the PyTorch API is updated in this version.

PyTorch API

There are three main improvements to the PyTorch AutoClip implementation. They are as follows:

  • New Optimizer Wrapping Pattern
  • Parameter Value and Type Checking
  • Testing with Pytest

Optimizer Wrapping

There were a couple of cases where the previous AutoClip API could not be used, primarily when the training code was imported from an outside library. To address this friction, an optimizer wrapping pattern has been added, along with the new OptimizerWithClipping class. This class is a true wrapper for the optimizer in question, meaning it functions almost identically, even if the wrapped optimizer has special fields and functions. This lets us shim in clipping behavior anywhere an optimizer is created. A marked improvement!
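
As an illustration of the pattern (a minimal sketch, not AutoClip's actual implementation), such a wrapper delegates attribute access to the wrapped optimizer and clips gradients just before each step:

```python
import torch

class ClippingWrapper:
    """Minimal sketch of the optimizer wrapping pattern; AutoClip's
    OptimizerWithClipping is more complete, this is illustration only."""

    def __init__(self, optimizer: torch.optim.Optimizer, max_norm: float = 1.0):
        self.optimizer = optimizer
        # Fixed threshold purely for illustration; AutoClip adapts the
        # threshold from the gradient history instead.
        self.max_norm = max_norm

    def step(self, closure=None):
        # Clip every parameter group's gradients, then defer to the
        # real optimizer, forwarding any closure it was given.
        for group in self.optimizer.param_groups:
            torch.nn.utils.clip_grad_norm_(group["params"], self.max_norm)
        return self.optimizer.step(closure)

    def __getattr__(self, name):
        # Delegation is what makes the wrapper behave almost identically
        # to the wrapped optimizer, special fields and methods included.
        return getattr(self.optimizer, name)

# Usage: shim in clipping anywhere an optimizer is created.
model = torch.nn.Linear(4, 2)
optimizer = ClippingWrapper(torch.optim.Adam(model.parameters(), lr=1e-3))
```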

Parameter Value and Type Checking

Clipper classes must now implement a verify_parameter_settings function. This function is called during __init__ on the clipper defaults, and again during add_param_group. This should give users much more accurate and useful error messages about invalid parameter inputs.
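
As a sketch of what such a check might look like (the exact signature here is an assumption, not taken from the library):

```python
class MyQuantileClipper:
    # Hypothetical clipper; only the validation hook is sketched.
    @staticmethod
    def verify_parameter_settings(settings: dict) -> None:
        # Assumed contract: receives a dict of parameter-group settings
        # and raises a descriptive error when a value is invalid.
        quantile = settings.get("quantile")
        if not isinstance(quantile, float):
            raise TypeError(
                f"quantile must be a float, got {type(quantile).__name__}"
            )
        if not 0.0 <= quantile <= 1.0:
            raise ValueError(f"quantile must be in [0.0, 1.0], got {quantile}")
```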

Testing

We now have tests! Hate to write them, love to have them. Not much more to say here, other than the reliability they should bring.

AutoClip 0.1: PyTorch API, Checkpointing and Local Clipping

09 Jul 21:27

AutoClip 0.1 Release Notes

First release of AutoClip as a package! The package is now available to install from PyPI, and all of the base features are complete.

Features

PyTorch API

The torch implementation of AutoClip has received a lot of attention, owing to some contributor preferences *ahem*. As such, it has a host of features beyond the basic implementation of the AutoClip algorithm.

Additional Clipping Method

In addition to the standard QuantileClip method, there is also a StandardClip method, which clips based on the standard deviation of previous gradients. Whether this is a beneficial or meaningful alternative remains to be seen, as no significant testing has been done.
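
For example, usage of the two clippers might look like the following; the import path and keyword names are assumptions based on the class names above, so treat them as illustrative:

```python
import torch
from autoclip.torch import QuantileClip, StandardClip  # import path assumed

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Quantile-based clipping (keyword name illustrative):
clipper = QuantileClip(model.parameters(), quantile=0.9)
# Or standard-deviation-based clipping:
# clipper = StandardClip(model.parameters())

data, target = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(10):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(data), target)
    loss.backward()
    clipper.step()  # adaptively clip gradients; method name assumed
    optimizer.step()
```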

Local Clipping

The torch clippers (both QuantileClip and StandardClip) support both global and local clipping modes, which allows for parameter-level clipping adaptation if desired.
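
Continuing the sketch above, selecting between the two modes might look like this; the global_threshold keyword is a guess at the flag's name, so treat it as hypothetical:

```python
from autoclip.torch import QuantileClip  # import path assumed

# Global mode: one clipping threshold is adapted across all parameters.
# Local mode: each parameter tracks its own history and threshold.
# The `global_threshold` keyword is hypothetical; check the docs.
global_clipper = QuantileClip(model.parameters(), global_threshold=True)
local_clipper = QuantileClip(model.parameters(), global_threshold=False)
```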

Checkpointing

Torch clippers also now support checkpointing via the PyTorch state_dict system. This is very important for using AutoClip in any sort of at-scale training environment.
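
Because the clippers plug into the standard state_dict machinery, checkpointing should follow the familiar torch pattern (continuing the sketch above, with model, optimizer, and clipper already constructed):

```python
import torch

# Save the clipper's state (its gradient history) alongside the rest.
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "clipper": clipper.state_dict(),
}
torch.save(checkpoint, "checkpoint.pt")

# Later: restore everything, including the clipping history.
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
clipper.load_state_dict(checkpoint["clipper"])
```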

Tensorflow API

Only minor changes have been made to the TensorFlow API; these aim to keep the parameter and naming schemes consistent between the two versions. As such, the TensorFlow implementation does not have feature parity with the PyTorch implementation.