Feature Request: Support for free and local LLMs (ollama) #95
Hi! That's actually in the works; I have a branch that already has a working prototype. I just need to document it. Feel free to try it and provide feedback 😄

```r
pak::pak("mlverse/chattr@ollama")
library(chattr)
chattr_use("ollama")
chattr("hi")
```
Hi! I followed your instructions but I cannot run Ollama. All I get is:

```
ℹ No downloads are needed
── chattr
```
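A quick way to troubleshoot this is to confirm the Ollama server itself is reachable before involving chattr. Below is a minimal sketch with httr2, assuming Ollama is on its default address of `http://localhost:11434` (`/api/tags` is Ollama's endpoint for listing pulled models):

```r
library(httr2)

# Ollama serves an HTTP API on localhost:11434 by default;
# /api/tags returns the models that have been pulled locally.
resp <- request("http://localhost:11434/api/tags") |>
  req_perform()

resp_body_json(resp)  # an empty model list means nothing has been pulled yet
```

If this request fails, the problem is on the Ollama side (server not started, or no model pulled) rather than in chattr.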
Rather than working specifically with ollama, could you allow defining the endpoint for the OpenAI connection? In addition to the endpoint, this should also allow selecting the specific model to connect to and the LiteLLM API key. I believe the `openai` package does most of this; inheriting it as the backend would let chattr focus on the front-end RStudio / Shiny integration. This would allow connection to LiteLLM and, from there, proxy connections to ollama or a variety of other LLMs.
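For illustration, here is a minimal sketch of such an endpoint-agnostic request using httr2 directly; the base URL, model name, and key below are placeholders for whatever a LiteLLM proxy (or any other OpenAI-compatible server) exposes:

```r
library(httr2)

# Placeholder values -- substitute your proxy's URL, model id, and key.
base_url <- "http://localhost:4000/v1"  # LiteLLM's default proxy address
model    <- "ollama/llama3"
api_key  <- Sys.getenv("LITELLM_API_KEY")

# A standard OpenAI-style chat completion request.
resp <- request(base_url) |>
  req_url_path_append("chat/completions") |>
  req_auth_bearer_token(api_key) |>
  req_body_json(list(
    model    = model,
    messages = list(list(role = "user", content = "hi"))
  )) |>
  req_perform()

resp_body_json(resp)$choices[[1]]$message$content
```

Because only `base_url`, `model`, and the key vary, one backend like this could cover OpenAI, LiteLLM, vLLM, and Ollama's OpenAI-compatible mode alike.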
Thank you. How do I do that?
Now, if I follow:

```r
pak::pak("mlverse/chattr@ollama")
library(chattr)
chattr_use("ollama")
chattr("hi")
```

I can work with ollama.
Apologies - to clarify, I was suggesting an alternative approach to @mlverse for the enhancement.
No problem!
@edgararuiz Is there a roadmap to merge the branch? Thanks!
I would really appreciate this. Our university runs an OpenAI-API-compatible server (vLLM). It would be great to use it by just setting the model endpoint URL.
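Since vLLM speaks the OpenAI API, the same pattern as the LiteLLM sketch above would apply. For example, discovering which models such a server exposes (the server URL here is a hypothetical placeholder):

```r
library(httr2)

# Hypothetical server URL -- vLLM exposes the standard OpenAI-compatible
# /v1/models endpoint for model discovery.
resp <- request("https://llm.example.edu/v1/models") |>
  req_auth_bearer_token(Sys.getenv("VLLM_API_KEY")) |>
  req_perform()

vapply(resp_body_json(resp)$data, function(m) m$id, character(1))
```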
Ollama is a fantastic tool that enables users to run freely available LLMs locally and chat with them via the command line. The set of available LLMs is updated regularly (llama3 became available this week).
My feature request is to enable chattr to interact with these local, open-source instances.