Releases: rsokl/MyGrad
Release MyGrad 1.9.0
Release MyGrad 1.8.1
Release MyGrad 1.8.0
add ver 1.8 docs (#264)

* add ver 1.8 docs
* Update docs/source/random.rst
* move test functions; get rid of deprecated random_integers()

Co-authored-by: Darshan Krishnaswamy <darsh797@gmail.com>
Release MyGrad 1.7.1
Fixes a bug in :func:`~mygrad.nnet.losses.negative_log_likelihood`, where setting ``constant=True`` had no effect.
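A quick sketch of the fixed behavior, assuming the loss takes per-sample log-probabilities and integer class labels (the data below is made up for illustration):

.. code-block:: python

    import mygrad as mg
    from mygrad.nnet.losses import negative_log_likelihood

    # made-up log-probabilities for 2 samples over 3 classes
    log_probs = mg.log(mg.Tensor([[0.2, 0.7, 0.1],
                                  [0.9, 0.05, 0.05]]))
    targets = [1, 0]  # true class index for each sample

    loss = negative_log_likelihood(log_probs, targets, constant=True)
    print(loss.constant)  # True: the loss back-propagates no gradient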
Release MyGrad 1.7.0
This release continues the process of integrating functions from `mynn <https://github.com/davidmascharka/MyNN>`_.
New features (see the usage sketch below):

- Adds :func:`~mygrad.nnet.initializers.glorot_normal`
- Adds :func:`~mygrad.nnet.initializers.glorot_uniform`
- Adds :func:`~mygrad.nnet.initializers.he_normal`
- Adds :func:`~mygrad.nnet.initializers.he_uniform`
- Adds :func:`~mygrad.nnet.initializers.normal`
- Adds :func:`~mygrad.nnet.initializers.uniform`
- Adds :func:`~mygrad.nnet.losses.focal_loss`
- Adds :func:`~mygrad.nnet.losses.negative_log_likelihood`
Big thanks to David Mascharka!
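A minimal usage sketch for the new initializers; the exact call pattern shown here (shape passed as unpacked integers, returning a MyGrad tensor) is an assumption made for illustration:

.. code-block:: python

    from mygrad.nnet.initializers import glorot_normal, he_uniform

    # assumed call pattern: shape given as unpacked ints
    w1 = glorot_normal(784, 256)  # weights for a 784 -> 256 dense layer
    w2 = he_uniform(256, 10)      # weights for a 256 -> 10 dense layer
    print(w1.shape, w2.shape)     # (784, 256) (256, 10)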
Improvements:
The interfaces to :func:`~mygrad.reshape` and :func:`~mygrad.Tensor.reshape` were adjusted to match exactly the interfaces to their NumPy counterparts. I.e. :func:`~mygrad.reshape` now requires ``newshape`` to be a sequence, whereas :func:`~mygrad.Tensor.reshape` can accept an unpacked sequence for its ``newshape``.
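For example:

.. code-block:: python

    import mygrad as mg

    x = mg.arange(6)

    # mygrad.reshape requires the new shape as a sequence, as numpy.reshape does
    y = mg.reshape(x, (2, 3))

    # Tensor.reshape can also take the shape as unpacked integers,
    # as numpy.ndarray.reshape does
    z = x.reshape(2, 3)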
:attr:`~mygrad.Tensor.shape` is now settable, triggering an in-place reshape of a tensor, matching the corresponding behavior in NumPy.
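E.g.:

.. code-block:: python

    import mygrad as mg

    x = mg.arange(6)
    x.shape = (2, 3)  # reshapes x in place, mirroring numpy's settable ndarray.shape
    print(x.shape)    # (2, 3)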
Internal changes:
The logic for writing an in-place operation has been consolidated into a convenient wrapper: :func:`~mygrad.Tensor._in_place_op`.
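For a rough sense of the pattern that such a wrapper consolidates, here is a hypothetical, simplified sketch; the class, attributes, and signature below are illustrative stand-ins, not MyGrad's actual internals:

.. code-block:: python

    import numpy as np

    class Tensor:
        """Toy stand-in used only to illustrate the pattern."""

        def __init__(self, data):
            self.data = np.asarray(data, dtype=float)
            self.creator = None  # the op that produced this tensor, if any

        def _in_place_op(self, op_func, *args, **kwargs):
            # hypothetical consolidation point: run the op out-of-place,
            # then graft the result's data and graph state onto `self`,
            # so that every in-place operation shares one code path
            out = op_func(self, *args, **kwargs)
            self.data = out.data
            self.creator = out.creator
            return self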
Release MyGrad 1.6.0