Releases
[0.3.1] - 2019-05-24
Added
Added cyclic learning rate finder
Added on_init callback hook to run at the end of trial init
Added callbacks for weight initialisation in torchbearer.callbacks.init
Added with_closure trial method that allows running of custom closures (see the sketch after this list)
Added base_closure function to bases that allows creation of standard training loop closures
Added ImagingCallback class for callbacks which produce images that can be sent to tensorboard, visdom or a file
Added CachingImagingCallback and MakeGrid callback to make a grid of images
Added the option to give the only_if callback decorator a function of self and state rather than just state (see the sketch after this list)
Added Layer-sequential unit-variance (LSUV) initialization
Added ClassAppearanceModel callback and example page for visualising CNNs
Added on_checkpoint callback decorator
Added support for PyTorch 1.1.0
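As a rough illustration of the new with_closure trial method, a custom training closure might be wired up as below. This is a minimal sketch only: the exact with_closure signature and the use of a state-keyed closure are assumptions not stated in this release entry, so check the torchbearer docs for the definitive form.

```python
import torch
import torchbearer
from torchbearer import Trial

# Hypothetical model, optimiser and criterion purely for illustration
model = torch.nn.Linear(10, 2)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = torch.nn.CrossEntropyLoss()

def closure(state):
    # A hand-written training step using torchbearer state keys
    state[torchbearer.OPTIMIZER].zero_grad()
    y_pred = state[torchbearer.MODEL](state[torchbearer.X])
    loss = state[torchbearer.CRITERION](y_pred, state[torchbearer.Y_TRUE])
    state[torchbearer.LOSS] = loss
    loss.backward()
    return loss

# Assumed usage: with_closure takes the closure and returns the trial for chaining
trial = Trial(model, optimiser, criterion, metrics=['loss']).with_closure(closure)
```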
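For the only_if decorator, the state-only predicate below follows the existing pattern; the new option described above (a predicate of self and state) is assumed to be passed the same way, so treat this as a sketch rather than the documented API.

```python
import torchbearer
from torchbearer.callbacks import on_step_training, only_if

# Print the batch loss, but only on every tenth training step.
# From 0.3.1 the predicate may (per this release) also be written as a
# function of (self, state) when used on callback methods.
@on_step_training
@only_if(lambda state: state[torchbearer.BATCH] % 10 == 0)
def print_loss(state):
    print(state[torchbearer.LOSS].item())
```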
Changed
no_grad and enable_grad decorators are now also context managers (sketch below)
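A minimal sketch of the dual decorator/context-manager usage is below. The import location (top-level torchbearer package) and the torch.no_grad-style call convention are assumptions not given in the entry above.

```python
import torch
import torchbearer

model = torch.nn.Linear(4, 1)  # hypothetical model for illustration

# Existing behaviour: use as a decorator so the wrapped function runs without gradients
@torchbearer.no_grad()
def evaluate(batch):
    return model(batch)

# New in 0.3.1: the same object also works as a context manager
with torchbearer.no_grad():
    out = model(torch.rand(2, 4))
```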
Deprecated
Removed
Removed the fluent decorator, just use return self
Removed install dependency on torchvision, still required for some functionality
Fixed
Fixed a bug where replay errored when train or val steps were None
Fixed a bug where mock optimiser wouldn't call its closure
Fixed a bug where the notebook check raised ModuleNotFoundError when IPython not installed
Fixed a memory leak with metrics that causes issues with very long epochs
Fixed a bug with the once and once_per_epoch decorators
Fixed a bug where the test criterion wouldn't accept a function of state
Fixed a bug where type inference would not work correctly when chaining Trial methods
Fixed a bug where checkpointers would error when they couldn't find the old checkpoint to overwrite
Fixed a bug where the 'test' label would sometimes not populate correctly in the default accuracy metric