This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

on-line hard mining #708

Closed · CoinCheung wants to merge 1 commit

Conversation

CoinCheung
Contributor

This is a demo of OHEM (online hard example mining). I have tested it on our own dataset, and the mAP/mAR improve by roughly 1 point.

The original paper uses NMS to eliminate overlapping RoIs, but it generates proposals with selective search, whereas this repository relies on an RPN to generate proposals. So I did not implement that part.

If this modification has potential risks or errors, please point them out and I will fix them.
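
For readers new to the technique, here is a minimal, hypothetical sketch of the OHEM selection idea this PR demonstrates (the function and variable names are illustrative and do not come from the diff): score every sampled RoI by its loss and backpropagate only through the hardest ones.

    import torch

    # Sketch only; not the code from this PR.
    def ohem_select(per_roi_loss, num_hard):
        # per_roi_loss: (N,) combined per-RoI loss (classification, plus box loss for positives)
        # num_hard: how many hard examples to keep for the backward pass
        num_hard = min(num_hard, per_roi_loss.numel())
        hard_loss, _ = per_roi_loss.topk(num_hard)  # largest losses = hardest examples
        return hard_loss.mean()                     # gradients flow only through the kept RoIs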

@facebook-github-bot added the CLA Signed label on Apr 22, 2019.
            beta=1,
        ).sum(dim=1, keepdim=True)
        ohem_loss = classification_loss
        ohem_loss[sampled_pos_inds_subset[:, None]] += box_loss

@WeihongM May 28, 2019


@hpandeycodeit @pietern @sotte This line may be wrong; it should probably be modified to
ohem_loss[sampled_pos_inds_subset[:, None]] = ohem_loss[sampled_pos_inds_subset[:, None]] + box_loss

Contributor Author


Hi,
Thanks for reviewing. Would you please explain why this is wrong? I just do not understand the difference.


@WeihongM May 28, 2019


@CoinCheung I am not sure about this. I tried your code above, but I found that it fails to add box_loss to the corresponding classification loss for OHEM. Maybe you can debug whether you see the same behavior.
Actually, I changed the code to:

 ohem_loss = classification_loss.clone()
 ohem_loss[sampled_pos_inds_subset[:, None]] = ohem_loss[sampled_pos_inds_subset[:, None]] + box_loss

Contributor


I would, at the very least, not use an in-place operation here, as it might break backpropagation depending on the operation.

You could replace it with something like what @WeihongM mentioned.
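
As an aside for readers following this thread, here is a small runnable sketch (hypothetical shapes and values, not the PR's code) of the aliasing behaviour being discussed: ohem_loss = classification_loss only binds a second name to the same tensor, so the in-place += also mutates classification_loss, whereas the clone-based variant suggested above leaves it untouched.

    import torch

    # Sketch only: shapes and values are hypothetical.
    classification_loss = torch.rand(8)                 # per-RoI classification loss
    box_loss = torch.rand(3, 1)                         # box loss for the 3 positive RoIs
    sampled_pos_inds_subset = torch.tensor([0, 2, 5])   # indices of the positive RoIs

    # Original line: ohem_loss is just another name for the same tensor,
    # so the in-place += below also modifies classification_loss.
    ohem_loss = classification_loss
    ohem_loss[sampled_pos_inds_subset[:, None]] += box_loss
    print(ohem_loss is classification_loss)             # True

    # Variant suggested in the review: clone first, then assign out of place.
    ohem_loss = classification_loss.clone()
    ohem_loss[sampled_pos_inds_subset[:, None]] = (
        ohem_loss[sampled_pos_inds_subset[:, None]] + box_loss
    )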

Contributor Author


Thanks for reviewing this. I have modified it as you suggested.

Contributor

@fmassa left a comment


This looks generally good, thanks!

Could you please address the comments? This is then good to merge!

@@ -361,8 +361,9 @@ def __init__(self, polygons, size):
        self.polygons = []
        for p in polygons:
            p = PolygonInstance(p, size)
            if len(p) > 0:
                self.polygons.append(p)
            # if len(p) > 0:
Contributor


Why do we have this change?

Contributor Author


Hi,

I believe this modification is no longer needed. I made it because our JSON files contain annotations with an empty segmentation field, which would break the logic here.
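
For context, a small hypothetical example of the situation described above (the data is made up, not from the PR): a COCO-style annotation whose segmentation field is empty yields an instance with no polygons, which the original if len(p) > 0 guard silently dropped.

    # Hypothetical COCO-style annotations; not data from the PR.
    annotations = [
        {"segmentation": [[10.0, 10.0, 50.0, 10.0, 50.0, 50.0]]},  # one valid polygon
        {"segmentation": []},                                      # empty segmentation field
    ]
    polygons_per_instance = [ann["segmentation"] for ann in annotations]
    kept = [p for p in polygons_per_instance if len(p) > 0]  # the empty instance is filtered out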

Contributor Author


I have re-submitted this PR; the new link is here. If you like, we can continue the discussion on that page :)


@chucklu commented Jun 2, 2019

@CoinCheung You can get the deleted branch in your fork back by following betaflight/betaflight#8112 (comment).

@LouieYang

@CoinCheung Why is ohem_loss.size(0) compared with self.batch_size_per_image? It seems that ohem_loss includes all proposals from all images in the batch; maybe it should be split per image with torch.split and then top_k used to select?
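
To make that suggestion concrete, here is a hedged sketch (hypothetical function and argument names) of the per-image selection LouieYang describes: split the concatenated per-RoI losses back into per-image chunks, then keep the batch_size_per_image hardest RoIs within each image.

    import torch

    # Sketch only; names are illustrative, not from the repository.
    def select_hard_examples(ohem_loss, rois_per_image, batch_size_per_image):
        # ohem_loss: (sum(rois_per_image),) per-RoI losses for the whole batch
        # rois_per_image: list with the number of proposals contributed by each image
        keep, offset = [], 0
        for per_image_loss in ohem_loss.split(rois_per_image):
            k = min(batch_size_per_image, per_image_loss.numel())
            _, idx = per_image_loss.topk(k)   # hardest (largest-loss) RoIs in this image
            keep.append(idx + offset)         # shift back to batch-level indices
            offset += per_image_loss.numel()
        return torch.cat(keep)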

@WeihongM

@CoinCheung Same question as LouieYang; I think this implementation has some problems.

@facebook-github-bot deleted the branch facebookresearch:master on September 1, 2021 at 11:41.