Integrate toma for automatic batch sizing #1444
Since we have our own implementation of this now, I am closing this.
Thanks for this PR implementing toma in PyTorch Lightning! If you copy code and ideas from my projects, could you please add a mention of it, too? I see that you're a fellow PhD student, so you are aware of the importance of credit assignment. In particular, if you copy code verbatim and remove helpful comments, maybe add them back.

The code in question is from https://github.com/BlackHC/toma/blob/master/toma/torch_cuda_memory.py. Now, this PR contains lots of other code, and I think it's great, but please add a link or a mention. Thank you. PS: Keeping my rather random method names is a bit of a give-away.
@BlackHC I'm sorry, I had no idea this code was copied... @SkafteNicki generally we like to do our own implementations of features. I'm sure it wasn't done with bad intentions, so we just need to find something that works here. Maybe we forgot to add this to our contribution guidelines (@Borda mind updating?). To rectify this, I suggest we use toma as-is and add it as a dependency for this feature. @BlackHC my deepest apologies, I was not aware that this code came from your repo! @PyTorchLightning/core-contributors thoughts?
@BlackHC I am sorry for this, I was not aware of it...
I want to thank everyone for replying so quickly. For future reference, most messages are in #1638. Thank you all!
@BlackHC want to integrate into lightning?
https://github.com/BlackHC/toma
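For context, the core idea behind toma-style automatic batch sizing is to run a function at an initial batch size and, whenever it fails with a CUDA out-of-memory error, halve the batch size and retry. The sketch below illustrates that retry loop in plain Python; the function name and a simulated OOM error are illustrative assumptions and do not reflect toma's or Lightning's actual API.

```python
def run_with_adaptive_batch_size(fn, initial_batch_size, min_batch_size=1):
    """Call fn(batch_size), halving batch_size on out-of-memory errors.

    Illustrative sketch of the toma idea (https://github.com/BlackHC/toma):
    PyTorch surfaces CUDA OOM as a RuntimeError whose message contains
    "out of memory", so we match on that substring and retry smaller.
    """
    batch_size = initial_batch_size
    while True:
        try:
            return fn(batch_size)
        except RuntimeError as err:
            # Re-raise anything that is not an OOM, or if we cannot shrink further.
            if "out of memory" not in str(err) or batch_size <= min_batch_size:
                raise
            batch_size //= 2  # halve and retry


# Usage with a fake training step that "fits" only at batch sizes <= 16:
def fake_train_step(batch_size):
    if batch_size > 16:
        raise RuntimeError("CUDA out of memory")  # simulated OOM
    return batch_size


largest_fitting = run_with_adaptive_batch_size(fake_train_step, 128)
```

In a real integration the halving loop would wrap the training step, and `torch.cuda.empty_cache()` would typically be called between retries to release cached allocations before the next attempt.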