
implement a softmax version of focal loss #6510

Closed
wyli opened this issue May 11, 2023 · 1 comment · Fixed by #6544

Comments

@wyli (Contributor)

wyli commented May 11, 2023

Is your feature request related to a problem? Please describe.

The softmax version of focal loss is:

  FL(p_t) = -alpha * (1 - p_t)**gamma * log(p_t),

where p_t = exp(s_t) / sum_j exp(s_j) is the softmax probability of the target
(ground truth) class t, and s_j is the unnormalized score for class j.

cf https://github.com/pytorch/pytorch/blob/main/modules/detectron/softmax_focal_loss_op.cc#L38-L43

The current implementation is hard-coded to use sigmoid:

class FocalLoss(_Loss):
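For reference, a softmax variant along the lines of the formula above could be sketched as follows. This is a minimal standalone illustration, not MONAI's implementation; the function name `softmax_focal_loss` and its signature are hypothetical. It uses `log_softmax` rather than `softmax().log()` for numerical stability.

```python
import torch
import torch.nn.functional as F

def softmax_focal_loss(logits: torch.Tensor,
                       target: torch.Tensor,
                       alpha: float = 1.0,
                       gamma: float = 2.0) -> torch.Tensor:
    """Hypothetical sketch of softmax focal loss.

    logits: (N, C) unnormalized scores s_j.
    target: (N,) integer class indices t.
    """
    # log p_j = log_softmax(s)_j, computed stably in log-space.
    log_p = F.log_softmax(logits, dim=-1)
    # Pick out log p_t for the target class of each sample.
    log_pt = log_p.gather(1, target.unsqueeze(1)).squeeze(1)
    pt = log_pt.exp()
    # FL(p_t) = -alpha * (1 - p_t)**gamma * log(p_t)
    loss = -alpha * (1.0 - pt) ** gamma * log_pt
    return loss.mean()
```

As a sanity check, with `gamma=0` and `alpha=1` the modulating factor vanishes and the result reduces to ordinary softmax cross entropy.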

@qingpeng9802 (Contributor)

I can work on this issue and open a PR later if no one else is working on it.

wyli pushed a commit that referenced this issue May 27, 2023
Fixes #6510 .

### Description

Add softmax version to Focal loss

### Types of changes
- [ ] Non-breaking change (fix or new feature that would not break
existing functionality).
- [x] Breaking change (fix or new feature that would cause existing
functionality to change).
- [x] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u
--net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick
--unittests --disttests`.
- [x] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/`
folder.

---------

Signed-off-by: Qingpeng Li <qingpeng9802@gmail.com>