I have 12 GB of VRAM, yet I'm still experiencing an out-of-memory (OOM) issue. Can someone specify how much RAM I need for it? #32

Closed
SAT431 opened this issue Oct 11, 2024 · 4 comments

Comments

@SAT431

SAT431 commented Oct 11, 2024

No description provided.

@Hussain-X

Are you using the cpu_offloading=True parameter?
As mentioned in #23.
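
For reference, here is a minimal sketch of where that flag is typically passed when driving the model from Python instead of app.py. The import path, class name, and every argument value below are assumptions based on my reading of the Pyramid-Flow README, not something confirmed in this thread, so check the repo for the exact signature of your version.

```python
import torch
from pyramid_dit import PyramidDiTForVideoGeneration  # assumed import path

# Assumed constructor arguments; point the path at your downloaded checkpoint.
model = PyramidDiTForVideoGeneration(
    "./pyramid_flow_model",
    model_dtype="bf16",
    model_variant="diffusion_transformer_768p",
)

with torch.no_grad(), torch.cuda.amp.autocast(enabled=True, dtype=torch.bfloat16):
    frames = model.generate(
        prompt="a cat walking on grass",        # placeholder prompt
        num_inference_steps=[20, 20, 20],
        video_num_inference_steps=[10, 10, 10],
        height=768,
        width=1280,
        temp=16,
        guidance_scale=9.0,
        video_guidance_scale=5.0,
        output_type="pil",
        cpu_offloading=True,  # keep idle sub-models in system RAM to fit in 12 GB of VRAM
    )
```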

@SAT431

SAT431 commented Oct 11, 2024

Are you using the cpu_offloading=True parameter? As mentioned in #23.

Yes, I used cpu_offloading=True, and I have 24 GB of RAM and 12 GB of VRAM. When it tries to load the models, I get this:

(myenv) sathvik@sathvik:/mnt/c/Users/pasun/Music/Pyramid-Flow$ python app.py
Model directory '/mnt/c/Users/pasun/Music/Pyramid-Flow/pyramid_flow_model' already exists. Skipping download.
using half precision
Using temporal causal attention
We interp the position embedding of condition latents
You set add_prefix_space. The tokenizer needs to be converted from the slow tokenizers
Loading checkpoint shards: 50%|█████████████████████████████ | 1/2 [00:45<00:45, 45.82s/it]Killed

@FurkanGozukara

@SAT431 "Killed" means the process ran out of system RAM and was terminated by the OS.

You need to buy more RAM.
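
One way to confirm this is to watch free system memory from a second terminal while the checkpoint shards load. A minimal sketch using psutil (not part of Pyramid-Flow, just a generic check):

```python
import time
import psutil

# Poll available system RAM while `python app.py` loads the checkpoint shards
# in another terminal. If the available figure drops toward zero right before
# the process dies, the kernel's OOM killer is what prints "Killed".
while True:
    mem = psutil.virtual_memory()
    print(f"available RAM: {mem.available / 1024**3:.1f} GiB ({mem.percent:.0f}% used)")
    time.sleep(2)
```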

@SAT431

SAT431 commented Oct 11, 2024

@FurkanGozukara Thank you.

SAT431 closed this as completed on Oct 11, 2024.