I am re-implementing the enhancement of DP-SGD through random sparsification of gradients on my UNet model. When extending the model with `extend(model)`, the BackPACK library reports that it does not support some of the modules in the model, specifically **GroupNorm**. Should I be creating custom extensions for the unsupported modules?

Here are the warnings logged when training the model:
```
env/lib/python3.11/site-packages/backpack/extensions/backprop_extension.py:106: UserWarning: Extension saving to grad_batch does not have an extension for Module <class 'networks.UNet'> although the module has parameters
  warnings.warn(
env/lib/python3.11/site-packages/backpack/extensions/backprop_extension.py:106: UserWarning: Extension saving to grad_batch does not have an extension for Module <class 'torch.nn.modules.normalization.GroupNorm'> although the module has parameters
```
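For reference, here is a minimal sketch of the kind of setup that produces these warnings. The toy `Sequential` model, shapes, and random data are placeholders standing in for the UNet; only `extend`, `backpack`, `BatchGrad`, and the `grad_batch` attribute are actual BackPACK API.

```python
import torch
from torch import nn
from backpack import backpack, extend
from backpack.extensions import BatchGrad

# Placeholder model: a GroupNorm layer between supported layers, standing in for the UNet.
model = extend(
    nn.Sequential(
        nn.Conv2d(3, 8, 3, padding=1),
        nn.GroupNorm(num_groups=4, num_channels=8),
        nn.Flatten(),
        nn.Linear(8 * 32 * 32, 2),
    )
)
loss_fn = extend(nn.CrossEntropyLoss())

x = torch.randn(16, 3, 32, 32)
y = torch.randint(0, 2, (16,))

loss = loss_fn(model(x), y)
with backpack(BatchGrad()):
    loss.backward()  # the UserWarnings about unsupported modules appear here

for name, p in model.named_parameters():
    # Supported layers get per-sample gradients in p.grad_batch (shape [N, *p.shape]);
    # the GroupNorm parameters are skipped, which is what the warnings report.
    print(name, hasattr(p, "grad_batch"))
```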
Thanks for the help :)
I assume from your logs that you would like to support `BatchGrad` for `nn.GroupNorm`.
You can follow the instructions here to achieve that. A PR adding this to BackPACK would be really cool, too :)
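As a concrete starting point, below is a hedged sketch of what such a module extension could look like, following the custom-module pattern from the BackPACK documentation (`FirstOrderModuleExtension`, `set_module_extension`, and the stored `module.input0` come from that pattern). The class `GroupNormBatchGrad` and the per-sample gradient formulas for GroupNorm's affine `weight`/`bias` are my own illustration, not code shipped with BackPACK, so please verify them against the linked instructions before relying on them.

```python
from torch.nn import GroupNorm
from backpack import backpack
from backpack.extensions import BatchGrad
from backpack.extensions.firstorder.base import FirstOrderModuleExtension


class GroupNormBatchGrad(FirstOrderModuleExtension):
    """Sketch: per-sample gradients for the affine parameters of nn.GroupNorm."""

    def __init__(self):
        super().__init__(params=["weight", "bias"])

    @staticmethod
    def _normalized_input(module):
        # Recompute the normalized activations x_hat from the input that
        # extend() stored on the module during the forward pass.
        x = module.input0                                    # shape (N, C, *)
        grouped = x.reshape(x.shape[0], module.num_groups, -1)
        mean = grouped.mean(dim=2, keepdim=True)
        var = grouped.var(dim=2, unbiased=False, keepdim=True)
        return ((grouped - mean) / (var + module.eps).sqrt()).reshape_as(x)

    def weight(self, ext, module, g_inp, g_out, bpQuantities):
        # dL_n/dgamma_c = sum over spatial positions of g_out * x_hat
        grad = g_out[0] * self._normalized_input(module)
        return grad.flatten(start_dim=2).sum(dim=2) if grad.dim() > 2 else grad

    def bias(self, ext, module, g_inp, g_out, bpQuantities):
        # dL_n/dbeta_c = sum over spatial positions of g_out
        grad = g_out[0]
        return grad.flatten(start_dim=2).sum(dim=2) if grad.dim() > 2 else grad


# Register the custom mapping before the backward pass:
extension = BatchGrad()
extension.set_module_extension(GroupNorm, GroupNormBatchGrad())
# with backpack(extension):
#     loss.backward()
```

With this registered, `weight.grad_batch` and `bias.grad_batch` of each `GroupNorm` layer should come out with shape `(N, C)`. The remaining warning about the `networks.UNet` container is usually benign as long as the UNet defines no parameters of its own and all of its parameter-carrying leaf layers are covered.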