This repository has been archived by the owner on Sep 27, 2024. It is now read-only.

please support airllm? #5

Open
showkeyjar opened this issue Nov 28, 2023 · 0 comments

Comments

@showkeyjar

AirLLM claims to be open source and to support inference of 70B LLMs on a single GPU with 4 GB of VRAM. Could you add support for it?
