Using devika on Docker Compose With External Ollama Server #257
Comments
Great one! Can it work on an M1?
@MalteBoehm The only ARM board I have tested it on is the RK3566, and it works...
I just tested it with an M1, and it does work. It took me some time to set up, but after that it worked great.
The stitionai/devika works fine for me today |
@hqnicolas thank you for the last update. Unfortunately I have the same symptoms as @subhajit20 and @janvi2021 with the last push (the web page is up, so the front end is OK, but the backend is not fully running).
I run everything on Ubuntu 22.04 and 23.10. Make sure that your Ollama server has these models:
Put your Bing API key from
Congratulations, this devika project is an amazing piece of art!
All the changes below were made to the hqnicolas devika fork:
Remove the Ollama server from docker compose
EDIT: docker-compose.yaml
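A minimal sketch of what the edited docker-compose.yaml could look like with the Ollama service removed. The service name, port, environment variable, and server address below are illustrative assumptions, not the repo's exact file:

```yaml
services:
  devika-backend:
    build:
      context: .
      dockerfile: devika.dockerfile
    ports:
      - "1337:1337"
    environment:
      # Point devika at the external Ollama server instead of a bundled container.
      # Replace the address with your own Ollama host.
      - OLLAMA_HOST=http://192.168.1.10:11434
```

With no `ollama` service defined, Compose no longer starts a local model server, and the backend talks to the external one over the network.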
Stop messing with the user setting in docker compose!
EDIT: devika.dockerfile
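As a rough sketch of the point above, a devika.dockerfile that leaves the user alone might look like this (the base image and install steps are assumptions, not the repo's actual file):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
# Note: no USER directive and no user: override in compose -- per the advice
# above, changing the container user tends to break permissions on mapped volumes.
CMD ["python", "devika.py"]
```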
Make sure that your Ollama server has these models:
Put your Bing API key from
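One way to check which required models are already present is to compare against the output of `ollama list`. The model names below are placeholders, not the guide's actual list; substitute the models you need:

```shell
# Placeholder list of models this setup needs -- substitute your own.
required="llama3 codellama"

# In practice you would capture the live list:  installed=$(ollama list)
# A canned example stands in for the server's response here.
installed="NAME
llama3:latest"

# Report any required model missing from the server.
for m in $required; do
  echo "$installed" | grep -q "^$m" || echo "missing: $m"
done
```

Anything reported missing can then be fetched with `ollama pull <model>` on the server.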
BING = "https://api.bing.microsoft.com/v7.0/search"
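To sanity-check the endpoint above, a request to Bing Web Search v7 can be formed like this; `BING_API_KEY` is an assumed environment variable holding your key, and the query is just an example:

```shell
# The endpoint from the config line above.
BING_ENDPOINT="https://api.bing.microsoft.com/v7.0/search"
QUERY="devika+ai"

# The real call (needs a valid key):
#   curl -s -H "Ocp-Apim-Subscription-Key: $BING_API_KEY" "$BING_ENDPOINT?q=$QUERY"
# Print the request line instead of sending it:
echo "GET $BING_ENDPOINT?q=$QUERY"
```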