[Bug]: RuntimeError: Torch is not able to use GPU #15057
Comments
Hi. Check whether this error started appearing after installing some extension, for example sd-wav2lip-uhq.
This is with a fresh install; my extensions folder is completely empty. I also updated today to the latest version released yesterday and still no luck.
I found a solution for me...
Note: my currently installed version of WebUI is
Unfortunately, this does not fix it for me.
Hi, having the same issue - tried on Arch, Windows 10, and a clean install of Ubuntu 22.04 LTS. On Ubuntu, if I add the
Can you send the full log again?
The log doesn't change, other than no longer showing the Python version mismatch. However, I do know it accepts versions higher than the base 3.10.6, as it did before this break/error started showing up.
Set CUDA_PATH=venv\Lib\site-packages\torch\lib (or your own path) in the .bat file.
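For reference, one way to locate the CUDA libraries bundled inside the venv's torch package (the directory the comment above points CUDA_PATH at) is a short Python check. This is only an illustrative sketch, not part of the original thread:

```python
import os
import torch

# Directory of the installed torch package (e.g. venv/Lib/site-packages/torch)
torch_dir = os.path.dirname(torch.__file__)

# On Windows, torch's bundled CUDA DLLs live in the "lib" subdirectory;
# this is the path the comment above suggests assigning to CUDA_PATH.
print(os.path.join(torch_dir, "lib"))
```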
Strangely enough, when I run python, import torch, and print whether CUDA is available, it says False; however, I do know CUDA is installed on my computer, and when running the command Edit: I can also confirm Edit:
I then deleted my venv folder and restarted, and I still get the error. Reinstalling Stable Diffusion completely also does not work.
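The check described above can be reproduced with a few lines of Python. The exact fields printed here are an illustrative sketch of a typical diagnostic, not the commenter's original commands:

```python
import torch

# Whether PyTorch can see a usable CUDA device at all
print("cuda available:", torch.cuda.is_available())

# CUDA version the installed torch build was compiled against
# (None for a CPU-only build, which would explain is_available() == False)
print("torch cuda build:", torch.version.cuda)

# Number of GPUs PyTorch can enumerate
print("device count:", torch.cuda.device_count())
```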
I think I'm getting closer to fixing it! I have in my I ran I upgraded Python to 3.11.7. With all of the above I no longer have However, this brings up a new problem I am getting (attached).
Closing, as the new issue is likely a separate issue.
Checklist
What happened?
Attempting to launch webui-user.bat generates an error:
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
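This error comes from the startup CUDA self-test. Roughly, the launcher does something equivalent to the following (a simplified sketch under that assumption, not the repository's actual code), which is why passing the skip flag suppresses it:

```python
import torch

def check_cuda(skip_torch_cuda_test: bool) -> None:
    # Simplified sketch of the startup check: if the installed torch build
    # cannot see a CUDA device, launching aborts unless the
    # --skip-torch-cuda-test flag was passed in COMMANDLINE_ARGS.
    if skip_torch_cuda_test:
        return
    if not torch.cuda.is_available():
        raise RuntimeError(
            "Torch is not able to use GPU; "
            "add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check"
        )
```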
I have added --skip-torch-cuda to the command line arguments in webui-user.sh and it does not fix it; if I edit the .bat file instead, it does fix the error, but then it does not use my GPU.
Steps to reproduce the problem
What should have happened?
Stable Diffusion should have started up and been able to use my GPU.
What browsers do you use to access the UI?
Google Chrome
Sysinfo
Using --dump-sysinfo does not work in either webui-user.sh or the .bat file; however, it does work after editing
/stable-diffusion-webui/repositories/stable-diffusion-stability-ai/ldm/models/diffusion/ddpm.py
and /stable-diffusion-webui/extensions-builtin/LDSR/sd_hijack_ddpm_v1.py,
changing pytorch_lightning.utilities.distributed to pytorch_lightning.utilities.rank_zero.
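For context, the edit described above is the import-path change from issue #11458: in both files, the removed pytorch_lightning module path is swapped for its newer location. The specific imported name shown below, rank_zero_only, is an assumption based on that fix, not quoted from this thread:

```python
# Before (fails on newer pytorch_lightning, where the module was removed):
# from pytorch_lightning.utilities.distributed import rank_zero_only

# After (the module now lives under utilities.rank_zero):
from pytorch_lightning.utilities.rank_zero import rank_zero_only
```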
Edit: Disregard; even after performing the fix mentioned in issue #11458, I cannot get it to work. However, this is likely a separate issue from this CUDA problem.
sysinfo-2024-02-29-05-12.json
Console logs
Additional information
I believe this may be because of updating to the new beta NVIDIA App, which takes over the Control Panel and GeForce Experience; however, I cannot confirm this, as it has been a while since my last use.