Problem
Currently, Jupyter AI primarily relies on cloud-based models, which can be limiting for users who have custom LLM models stored locally. Does Jupyter AI support using locally available models from the file system? If not, this support should be added.
Proposed Solution
Please add support for using locally available LLM models, or, if this is already supported, provide detailed steps on how to use it or a pointer to the relevant documentation.
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template, as it helps other community members contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋
If you would like to use local models via a mechanism other than GPT4All, or if you have problems with the current GPT4All implementation, please add comments to existing issues or open new issues. Thanks for your interest in Jupyter AI!
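For reference, once a GPT4All model file has been downloaded, it can be used from a notebook through Jupyter AI's `%%ai` magic by selecting the `gpt4all` provider. A minimal sketch is shown below; the model id `ggml-gpt4all-j-v1.3-groovy` is one example from GPT4All's published model list, and the exact ids available depend on your Jupyter AI version and which model files you have placed in GPT4All's local model directory:

```
%load_ext jupyter_ai_magics

%%ai gpt4all:ggml-gpt4all-j-v1.3-groovy
Explain what a Jupyter kernel is.
```

Local models can also typically be selected from the language model dropdown in the Jupyter AI chat settings panel, which avoids the magic syntax entirely.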