loss_contrast #54

Open
Summer77723 opened this issue Aug 10, 2022 · 2 comments

@Summer77723

Hello. Thanks for your great work.
I have some questions about the code: what is the meaning of `with_embed`, and why is `loss_contrast` not used?
1.
   ```python
   if with_embed is True:
       return loss + self.loss_weight * loss_contrast

   return loss + 0 * loss_contrast  # just a trick to avoid errors in distributed training
   ```
2.
   ```python
   if is_distributed():
       import torch.distributed as dist

       def reduce_tensor(inp):
           """
           Reduce the loss from all processes so that
           the process with rank 0 has the averaged results.
           """
           world_size = get_world_size()
           if world_size < 2:
               return inp
           with torch.no_grad():
               reduced_inp = inp
               dist.reduce(reduced_inp, dst=0)
           return reduced_inp

       loss = self.pixel_loss(outputs, targets, with_embed=with_embed)

       backward_loss = loss
       display_loss = reduce_tensor(backward_loss) / get_world_size()
   else:
       backward_loss = display_loss = self.pixel_loss(outputs, targets)
   ```

Looking forward to your reply!

@tfzhou
Owner

tfzhou commented Sep 1, 2022

Hi, @Summer77723, our code has a warmup stage, during which the contrastive loss is not applied, i.e., the weight of the contrastive loss is zero.
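
A minimal sketch of how such a warmup could be wired up (illustrative only; `contrast_weight`, `warmup_iters`, `base_weight`, and `current_iter` are hypothetical names, not identifiers from this repository): the contrastive term is weighted by zero during the first iterations and by the configured weight afterwards.

```python
def contrast_weight(current_iter: int, warmup_iters: int, base_weight: float) -> float:
    """Illustrative warmup schedule: weight is 0 during warmup, then the configured value."""
    return 0.0 if current_iter < warmup_iters else base_weight

# Example usage inside a training loop (hypothetical variable names):
# w = contrast_weight(it, warmup_iters=5000, base_weight=0.1)
# total_loss = seg_loss + w * loss_contrast
```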

@Summer77723
Author

Thank you for your reply.
