Rewrite draw_segmentation_masks and update gallery example to illustrate both instance and semantic segmentation models #3824
Can you instead do torch.nn.functional.softmax(masks, dim=0), or maybe masks.softmax(dim=0)? Actually, you can probably just do … no?

I think it's an anti-pattern to instantiate the nn.Module just to perform this operation once.
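A minimal sketch of the suggested forms, assuming masks holds the (num_classes, H, W) logits for a single image (the shape and values are made up for illustration); both one-liners avoid constructing an nn.Softmax module for a one-off call:

import torch

# Hypothetical (num_classes, H, W) logits for a single image
masks = torch.randn(21, 32, 32)

# Anti-pattern: instantiating a module just to call it once
probs_module = torch.nn.Softmax(dim=0)(masks)

# Equivalent one-liners: the functional API and the Tensor method
probs_functional = torch.nn.functional.softmax(masks, dim=0)
probs_method = masks.softmax(dim=0)

assert torch.allclose(probs_module, probs_functional)
assert torch.allclose(probs_functional, probs_method)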
Ok, I'll use the functional API.

normalized_masks = output.softmax(dim=1)

I think this would compute the softmax across the batch as well (dimension 0): we wouldn't end up with values that sum to 1 for each image independently, they would sum to 1 across the entire batch. That's actually something I wanted to double-check with you, WDYT?

Edit: Hm, they do seem to be the same. I'll use the simpler version as suggested then.
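A quick check along the lines of the edit above, assuming output is the (batch, num_classes, H, W) tensor of model logits (shapes made up here): softmax over dim=1 normalizes the class scores at each pixel, so each image sums to 1 independently of the rest of the batch:

import torch

# Hypothetical (batch, num_classes, H, W) logits
output = torch.randn(4, 21, 32, 32)

normalized_masks = output.softmax(dim=1)

# Class probabilities sum to 1 at every pixel of every image,
# i.e. each image is normalized independently of the batch
assert torch.allclose(normalized_masks.sum(dim=1), torch.ones(4, 32, 32))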
I think it should do the expected thing if we use output.softmax(dim=1), but it's a good question. We already use it in the loss for the segmentation models, which internally is handled by passing dim=1 to (log)softmax. Internally, the softmax implementation splits the dimensions into the "batch dimensions" (those that come before dim) and the "to be softmaxed" dimensions (those that come after dim), see here for more details.
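To illustrate the batch-dimension point, here is a small sketch (shapes again made up): applying softmax over dim=1 to the whole batch gives the same result as softmaxing each image's class dimension separately, so dim 0 really is treated as a pure batch dimension:

import torch
import torch.nn.functional as F

# Hypothetical (batch, num_classes, H, W) logits
output = torch.randn(4, 21, 32, 32)

batched = F.softmax(output, dim=1)

# Softmax each image on its own; for a (num_classes, H, W) image
# the class dimension is dim 0
per_image = torch.stack([F.softmax(img, dim=0) for img in output])

assert torch.allclose(batched, per_image)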