
Putting torch dependency behind an Extra flag #1449

Open
istrupin opened this issue Dec 5, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@istrupin

istrupin commented Dec 5, 2024

🚀 The feature

I'd like to put the torch dependency behind an extras flag, so users can opt into it and avoid a potential ~6 GB install.

Motivation, pitch

When torch was added as a dependency in 2.3.1, the library size increased dramatically. On my Apple Silicon Mac, the install went from ~600 MB to ~2 GB. On a fresh Debian build box, it was ~6 GB (torch pulls in NVIDIA dependencies on Linux).

Looking through the code, it seems that torch is not explicitly used anywhere, so I imagine it is a transitive dependency for another feature.

I'd like to suggest putting the torch dependency behind an extras flag (similar to the existing Excel extra, for example), so users who don't mind the large size can opt into it.
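For illustration, the extras flag could be declared along these lines in pyproject.toml. This is a hypothetical sketch: the extra's name, the version pin, and the exact table depend on the project's build backend.

```toml
[project.optional-dependencies]
# Hypothetical "torch" extra, installed only via: pip install pandasai[torch]
torch = [
    "torch>=2.0",
]
```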

Alternatives

For now, I have downgraded to 2.3.0, which does not have the torch dependency. I also tried manually removing the torch/NVIDIA packages from my environment, which drastically reduced the image size, and core functionality seemed unharmed.

Additional context

No response

@dosubot dosubot bot added the enhancement New feature or request label Dec 5, 2024
@Ashish0898

Just wanted to check whether this has been implemented.

If yes, could you please share the docs/code snippet for it?

In our case as well, the Docker image size has grown to ~6 GB.

Any assistance on this would be really helpful.

@istrupin

Not to my knowledge, unfortunately. I ended up pinning my dependency to an old version. I think this would be a worthwhile feature to add though, especially going into V3.0.

@Ashish0898
Copy link

Ahh! Bummer!

We went a bit crazy with our workaround, since we decided to keep the latest version: we installed it and then used the pip-autoremove library to remove torch and its dependencies.

Not sure how good a solution this is, but so far it seems to be working for us. I guess eventually we'll also downgrade and pin to an older version.
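For anyone wanting to try this workaround, the pip-autoremove route sketched above is roughly the following. Treat it as a sketch: verify the flags against your pip-autoremove version, and test your application afterwards, since this removes packages pip still believes are required.

```shell
# Remove torch together with the dependencies that only it pulls in.
pip install pip-autoremove
pip-autoremove torch -y

# Sanity check: torch should no longer be installed.
pip show torch || echo "torch removed"
```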

@istrupin

I haven't seen a comment on this issue from any of the official maintainers. I suppose I could just submit a PR, but I'd love to get the official go-ahead first, even if only to know which version to submit against. My build-tool skills are a little rusty, but I'm sure I could figure something out.

Thanks for the tip about pip-autoremove though, that's not a bad workaround either.

@Ashish0898

I'm more than happy to work with you on this!

I just checked the open PRs; they're being raised against the latest version, so I guess we can do the same.

@gventuri
Collaborator

@istrupin @Ashish0898 good catch. I had a quick look and it seems to be pulled in by safetensors, from the transformers library. I think the transformers library should not be used as of v3.0 (but I need to check), so I'm quite sure we can get rid of it.

If anyone wants to join in the effort of exploring this matter further, it would be great!

Will keep you posted!
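To confirm which installed package actually declares torch as a requirement (e.g. whether it really is safetensors or transformers), a quick check with the standard library's importlib.metadata can help. This is a generic sketch, not project-specific code:

```python
import re
from importlib import metadata


def find_requirers(target: str) -> list[str]:
    """Return names of installed distributions that declare `target` as a requirement."""
    target = target.lower()
    requirers = []
    for dist in metadata.distributions():
        for req in dist.requires or []:
            # Requirement strings look like "torch>=2.0; extra == 'gpu'";
            # grab just the leading distribution name.
            match = re.match(r"[A-Za-z0-9_.\-]+", req)
            if match and match.group(0).lower() == target:
                requirers.append(dist.metadata["Name"])
    return sorted(set(requirers))


print(find_requirers("torch"))
```

Running this inside the affected Docker image should list every package responsible for pulling torch in.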
