Removes need to unsqueeze from dp #1319
Conversation
This pull request is now in conflict... :(
Nice! :)
```diff
@@ -199,3 +202,15 @@ def _worker(i, module, input, kwargs, device=None):
             raise output
         outputs.append(output)
     return outputs
+
+
+def auto_squeeze_dim_zeros(output):
```
How about just "unsqueeze_scalars"?
Co-Authored-By: Adrian Wälchli <adrian.waelchli@students.unibe.ch>
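For context, here is a minimal sketch of what a helper like `auto_squeeze_dim_zeros` could look like, inferred from the diff header and the surrounding discussion rather than copied from the merged code: it walks the `training_step` output dict and unsqueezes any 0-dim tensor so that `DataParallel.gather` can concatenate per-GPU results along dim 0. (Note the name says "squeeze" but the operation is an unsqueeze, which is what the rename suggestion above is about.)

```python
import torch


def auto_squeeze_dim_zeros(output):
    """Unsqueeze scalar (0-dim) tensors in a step-output dict so that
    DataParallel.gather can concatenate them along dim 0.

    Sketch only; the merged implementation may differ.
    """
    for k, v in output.items():
        if not isinstance(v, torch.Tensor):
            continue
        # a bare loss such as torch.tensor(0.5) has dim() == 0
        if v.dim() == 0:
            output[k] = v.unsqueeze(0)
```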
This pull request is now in conflict... :(
Codecov Report
```
@@           Coverage Diff           @@
##           master   #1319    +/-   ##
=======================================
+ Coverage      92%     92%     +<1%
=======================================
  Files          62      62
  Lines        3239    3246      +7
=======================================
+ Hits         2964    2971      +7
  Misses        275     275
```
After this change, is it still needed to suppress this warning?
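For reference, the warning being asked about is presumably PyTorch's gather warning for all-scalar inputs ("Was asked to gather along dimension 0, but all input tensors were scalars..."). If it still had to be silenced, a typical filter would look like the following; the message pattern is an assumption about which warning is meant:

```python
import warnings

# Assumed target: torch.nn.parallel's scalar-gather warning.
warnings.filterwarnings(
    'ignore',
    message=r'.*Was asked to gather along dimension 0.*',
)
```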
* removes need to unsqueeze from dp
* removes need to unsqueeze from dp
* fixed examples
* added auto unsqueeze
* added auto unsqueeze
* added auto unsqueeze
* added auto unsqueeze
* Update pytorch_lightning/overrides/data_parallel.py

Co-Authored-By: Adrian Wälchli <adrian.waelchli@students.unibe.ch>

* fixed dp parse
* fixed dp parse

Co-authored-by: Adrian Wälchli <adrian.waelchli@students.unibe.ch>
No description provided.
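Since the PR carries no description, here is a hedged illustration of the user-facing effect (the method bodies and `compute_loss` are invented for the example, not taken from the PR): before this change, a `training_step` running under `distributed_backend='dp'` had to unsqueeze scalar outputs by hand; after it, returning a bare 0-dim loss is enough.

```python
# Before this PR: manual unsqueeze needed so DP gather could stack losses.
def training_step(self, batch, batch_idx):
    loss = self.compute_loss(batch)      # compute_loss is illustrative
    return {'loss': loss.unsqueeze(0)}   # had to be 1-dim per GPU

# After this PR: the dp override unsqueezes scalars automatically.
def training_step(self, batch, batch_idx):
    loss = self.compute_loss(batch)
    return {'loss': loss}                # bare scalar is now fine
```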