🐛 Bug

I run training with precision=16. The model outputs float16 logits, which I pass to F.softmax and then to metrics.Accuracy. The dtype of F.softmax(logits) depends on whether precision=16 is specified, so under mixed precision the Accuracy metric receives float16 probabilities instead of float32.
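For illustration (this snippet is mine, not from the original report): F.softmax preserves the dtype of its input, so float16 logits yield float16 probabilities:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 4, dtype=torch.float16)  # like the model output under precision=16
probs = F.softmax(logits, dim=1)                 # needs half-precision softmax support (always available on CUDA)
print(probs.dtype)                               # torch.float16 -- softmax keeps the input dtype
```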
To Reproduce
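The original repro script did not survive in this copy of the issue. Below is a minimal sketch of the setup described above, assuming a PyTorch Lightning 1.x-era API where Accuracy lives under pytorch_lightning.metrics (it has since moved to torchmetrics); the ToyClassifier model and the random dataset are invented for illustration.

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
from pytorch_lightning import metrics


class ToyClassifier(pl.LightningModule):
    """Hypothetical minimal model standing in for the reporter's network."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 4)
        self.accuracy = metrics.Accuracy()

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self.layer(x)
        probs = F.softmax(logits, dim=1)
        # The reporter observes torch.float16 here when precision=16 is set,
        # and torch.float32 with the default precision=32.
        print(probs.dtype)
        self.log("train_acc", self.accuracy(probs, y))
        return F.cross_entropy(logits, y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    data = torch.utils.data.TensorDataset(
        torch.randn(64, 32), torch.randint(0, 4, (64,))
    )
    loader = torch.utils.data.DataLoader(data, batch_size=16)
    # precision=16 (native AMP) requires a GPU.
    trainer = pl.Trainer(gpus=1, precision=16, max_epochs=1)
    trainer.fit(ToyClassifier(), loader)
```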
Expected behavior
Same as for float32: the Accuracy metric should be computed correctly regardless of whether precision=16 is set.
Additional context
Precision, recall and F1 metrics seem to be calculated correctly.
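One possible interim workaround (my suggestion, not from the report): cast the probabilities back to float32 before passing them to the metric, e.g. via softmax's dtype argument, which casts the input before the operation:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 4, dtype=torch.float16)        # hypothetical fp16 logits
probs = F.softmax(logits, dim=1, dtype=torch.float32)  # compute softmax in float32
print(probs.dtype)                                     # torch.float32
```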