Native support for remote models #5790
ClaudioChicoDev started this conversation in Ideas
Just a thought.
With fine-tuning becoming cheaper by the day (QLoRA), there's a stronger argument than ever for using multiple models per project, but we're limited by the local machine's hardware.
It'd be great if LangChain offered a solution directly in the library for connecting to remote models running elsewhere (over a socket).
Ideally we'd be able to keep a pool of models ready to go somewhere, and LangChain projects would be able to connect to them.
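To make the idea concrete, here's a minimal sketch of what such a pool might look like. Everything here is hypothetical, not an existing LangChain API: the `RemoteModel` / `ModelPool` names, the hosts/ports, and the JSON-over-socket wire format are all illustrative assumptions.

```python
import itertools
import json
import socket
from dataclasses import dataclass


@dataclass
class RemoteModel:
    """One fine-tuned model served on a remote host (hypothetical setup)."""
    name: str
    host: str
    port: int

    def generate(self, prompt: str, timeout: float = 30.0) -> str:
        # Assumed wire format: one JSON request, one newline-terminated
        # JSON reply per connection. A real protocol would need auth,
        # streaming, retries, etc.
        with socket.create_connection((self.host, self.port), timeout=timeout) as conn:
            conn.sendall(json.dumps({"prompt": prompt}).encode() + b"\n")
            reply = conn.makefile().readline()
        return json.loads(reply)["text"]


class ModelPool:
    """Hands out remote models round-robin, so a project can spread
    requests across several fine-tuned models running elsewhere."""

    def __init__(self, models):
        self._cycle = itertools.cycle(models)

    def next_model(self) -> RemoteModel:
        return next(self._cycle)


# A project would register its remote models once, then pull from the pool:
pool = ModelPool([
    RemoteModel("qlora-summarizer", "10.0.0.5", 9000),
    RemoteModel("qlora-classifier", "10.0.0.6", 9000),
])
```

A LangChain integration could wrap something like `RemoteModel.generate` in a custom LLM class, so chains stay unchanged while inference happens off-machine.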