Excellent work! I wonder if there is any way to load the model across more than one GPU, because even the 7B model consumes more than 20 GB of memory, which exceeds the capacity of a single GPU.
Hi @David-Zeng-Zijian, the current model requires around 24GB of memory to run on a single GPU. Possible workarounds include model quantization and model parallelism. It would be great if you could share with us any solution you find for deploying PandaGPT across multiple GPUs.
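For the model-parallel route, one common approach is to split the transformer blocks across GPUs, as Hugging Face Accelerate does when you pass `device_map="auto"` to `from_pretrained`. The sketch below only illustrates the planning step with a greedy layer-to-GPU assignment; the per-layer sizes and GPU capacities are hypothetical placeholders, not PandaGPT's actual layout:

```python
# Illustrative sketch of naive model parallelism: assign consecutive
# transformer layers to GPUs, spilling to the next GPU when the current
# one would overflow. All sizes below are hypothetical.

def plan_device_map(layer_sizes_gb, gpu_capacity_gb):
    """Map layer index -> GPU index using a greedy first-fit split."""
    device_map = {}
    gpu, used = 0, 0.0
    for i, size in enumerate(layer_sizes_gb):
        if used + size > gpu_capacity_gb:
            gpu += 1       # current GPU is full; move to the next one
            used = 0.0
        device_map[i] = gpu
        used += size
    return device_map

# A 7B-parameter model in fp16 holds roughly 14 GB of weights; with
# activations on top, the total exceeds a single 16 GB card, so we
# split 32 blocks (~0.43 GB each, hypothetical) across two GPUs,
# budgeting ~8 GB of weights per card.
plan = plan_device_map([0.43] * 32, gpu_capacity_gb=8.0)
print(plan[0], plan[31])   # first layer on GPU 0, last layer on GPU 1
```

In practice, libraries handle this automatically (e.g. Accelerate's `infer_auto_device_map`), and 8-bit quantization via `load_in_8bit=True` in `transformers` can roughly halve the footprint instead of splitting it.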