feat: add dice loss in linknet #816
Conversation
Codecov Report

@@            Coverage Diff             @@
##             main     #816      +/-   ##
==========================================
+ Coverage   96.01%   96.02%   +0.01%
==========================================
  Files         131      131
  Lines        4941     4935       -6
==========================================
- Hits         4744     4739       -5
+ Misses        197      196       -1
Thanks! Added a few optional suggestions
inter = (prob_map[seg_mask] * seg_target[seg_mask]).sum()
cardinality = (prob_map[seg_mask] + seg_target[seg_mask]).sum()
dice_loss = 1 - 2 * inter / (cardinality + 1e-8)
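For context, the three lines above can be wrapped into a runnable function. This is a minimal NumPy sketch, not doctr's actual implementation; the function name, argument shapes, and the toy example are assumptions.

```python
import numpy as np

def dice_loss(prob_map, seg_target, seg_mask, eps=1e-8):
    """Dice loss restricted to the pixels selected by seg_mask (boolean array).

    All pixels of all classes are pooled into a single intersection/cardinality,
    which is exactly the behavior discussed below (large classes dominate).
    """
    inter = (prob_map[seg_mask] * seg_target[seg_mask]).sum()
    cardinality = (prob_map[seg_mask] + seg_target[seg_mask]).sum()
    return 1 - 2 * inter / (cardinality + eps)

# Hypothetical toy example: a perfect prediction gives a loss close to 0,
# an inverted prediction gives a loss close to 1.
target = np.array([[1., 0.], [0., 1.]])
mask = np.ones_like(target, dtype=bool)
print(dice_loss(target, target, mask))      # close to 0
print(dice_loss(1 - target, target, mask))  # close to 1
```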
We'll have to update this in another PR because it heavily favors the classes that occupy the largest area.
Yes, I will do it in the next PR !
To curb this, maybe FL + Dice loss could be a better option? Even if the classes covered the same area, the total area of the combined bboxes can be a very small fraction of the image, so the model may tend to learn mostly negatives.
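The FL suggested here is focal loss, which down-weights easy, abundant negatives. A minimal NumPy sketch of the binary variant, as an assumption about what the combination would look like (names, defaults, and shapes are hypothetical, not doctr code):

```python
import numpy as np

def focal_loss(prob_map, seg_target, gamma=2.0, alpha=0.25, eps=1e-8):
    """Binary focal loss: (1 - p_t)^gamma scales down well-classified pixels,
    so the many easy background pixels contribute little to the total."""
    p = np.clip(prob_map, eps, 1 - eps)
    pt = np.where(seg_target == 1, p, 1 - p)            # prob of the true class
    alpha_t = np.where(seg_target == 1, alpha, 1 - alpha)
    return float((-alpha_t * (1 - pt) ** gamma * np.log(pt)).mean())
```

The combined objective would then simply be a (possibly weighted) sum, e.g. `focal_loss(...) + dice_loss(...)`, with the weighting left as a tuning choice.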
Yup, I meant that the baseline option would be to average the loss over classes (i.e. not mixing the contributions of the classes before the final step). But your suggestion @SiddhantBahuguna is worth exploring as a follow-up and could improve results 👍
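The per-class averaging described above could be sketched as follows. This is a hypothetical NumPy illustration (a `(C, H, W)` layout and the function name are assumptions), not the actual follow-up PR:

```python
import numpy as np

def class_averaged_dice_loss(prob_map, seg_target, seg_mask, eps=1e-8):
    """Compute a dice loss per class, then average, so that a class covering
    a large area cannot dominate the gradient of the smaller ones.

    prob_map, seg_target: (C, H, W) arrays; seg_mask: (H, W) boolean mask.
    """
    losses = []
    for c in range(prob_map.shape[0]):
        p, t = prob_map[c][seg_mask], seg_target[c][seg_mask]
        inter = (p * t).sum()
        cardinality = (p + t).sum()
        losses.append(1 - 2 * inter / (cardinality + eps))
    return float(np.mean(losses))
```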
Thanks for your suggestions, I will add focal loss in the next PR!
Looks good!
This PR adds dice loss to the linknet loss (which is now bce_loss + dice_loss) and removes the boosted-edges part of the loss, since it hasn't led to successful training runs. Any feedback is welcome!