[several models] improve readability #24585
Conversation
The documentation is not available anymore as the PR was closed or merged.
It's actually mentioned, but I agree it's a bit less explicit than
@@ -977,7 +977,7 @@ def __init__(self, config: CLIPConfig):
         self.visual_projection = nn.Linear(self.vision_embed_dim, self.projection_dim, bias=False)
         self.text_projection = nn.Linear(self.text_embed_dim, self.projection_dim, bias=False)
-        self.logit_scale = nn.Parameter(torch.ones([]) * self.config.logit_scale_init_value)
+        self.logit_scale = nn.Parameter(torch.tensor(1.0) * self.config.logit_scale_init_value)
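For anyone unsure what the original expression evaluates to: `torch.ones([])` builds a 0-dimensional (scalar) tensor of `1.0`, so multiplying it by the init value yields a scalar tensor holding that value, identical in shape and value to constructing the tensor directly. A minimal sketch (the value `2.6592` is CLIP's default `logit_scale_init_value`, used here purely for illustration):

```python
import torch

logit_scale_init_value = 2.6592  # CLIP's default init value, for illustration

# Original pattern: torch.ones([]) creates a 0-dim (scalar) tensor of 1.0;
# multiplying by the init value gives a scalar tensor holding that value.
old_style = torch.ones([]) * logit_scale_init_value

# Proposed pattern: construct the scalar tensor directly.
new_style = torch.tensor(logit_scale_init_value)

# Both are 0-dimensional and numerically equal.
assert old_style.shape == torch.Size([])
assert new_style.shape == torch.Size([])
assert torch.isclose(old_style, new_style)
```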
Let's change all such occurrences? Most of them are in clip-family models.
Totally!
done
Thanks
Gentlemen, after sleeping on it some more, it came to me that the cleanest, most readable solution is just:

Do you agree? And yes, I can then fix the other places.
OK, I have used the

pattern everywhere I found it. Please have a look.
Thanks a lot!
Nice find :-)
Honestly I had no idea what

torch.ones([]) * self.config.logit_scale_init_value

would return - it's not documented either. Proposing to change it to a very clear

torch.tensor(self.config.logit_scale_init_value)

which leaves no doubt as to what it does.
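The final pattern the PR converged on can be sketched as a minimal module. Note that `TinyCLIPHead` is a hypothetical name for illustration, not a class from transformers; only the `nn.Parameter(torch.tensor(...))` line mirrors the pattern discussed above:

```python
import torch
import torch.nn as nn


class TinyCLIPHead(nn.Module):
    """Illustrative module showing the readable logit-scale initialization."""

    def __init__(self, logit_scale_init_value: float = 2.6592):
        super().__init__()
        # Directly build a scalar tensor from the config value - no
        # torch.ones([]) * value indirection needed.
        self.logit_scale = nn.Parameter(torch.tensor(logit_scale_init_value))


head = TinyCLIPHead()
# The parameter is a trainable 0-dim tensor holding the init value.
assert head.logit_scale.requires_grad
assert head.logit_scale.shape == torch.Size([])
```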