Using devika on Docker Compose With External Ollama Server #257

Closed
hqnicolas opened this issue Mar 29, 2024 · 9 comments
hqnicolas commented Mar 29, 2024

Congratulations, this Devika project is an amazing piece of art!

All changes below were made to my fork, hqnicolas/devika.

Remove the Ollama server from the Docker Compose file:

EDIT: docker-compose.yaml

version: "3.9"

services:
  devika-backend-engine:
    build:
      context: .
      dockerfile: devika.dockerfile
    expose:
      - 1337
    ports:
      - 1337:1337
    environment:
      - OLLAMA_HOST=http://192.168.0.21:11434
    healthcheck:
      test: ["CMD-SHELL", "curl -f http://localhost:1337/ || exit 1"]
      interval: 5s
      timeout: 30s
      retries: 5
      start_period: 30s
    volumes:
      - devika-backend-dbstore:/home/nonroot/devika/db
    networks:
      - devika-subnetwork

  devika-frontend-app:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - VITE_API_BASE_URL=http://127.0.0.1:1337
    depends_on:
      - devika-backend-engine
    expose:
      - 3000
    ports:
      - 3000:3000
    networks:
      - devika-subnetwork

networks:
  devika-subnetwork:

volumes:
  devika-backend-dbstore:
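
Before bringing the stack up, it is worth checking that the external Ollama server is reachable from the Docker host. A quick sanity-check sketch (the IP matches the OLLAMA_HOST value above; adjust it for your network):

# run on the Docker host
curl http://192.168.0.21:11434/api/version   # should print the Ollama version as JSON
curl http://192.168.0.21:11434/api/tags      # should list the models installed on the server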

Stop messing with the non-root user in the Docker setup; this Dockerfile runs everything as root:

EDIT: devika.dockerfile

FROM debian:12

# setting up os env
USER root
WORKDIR /home/nonroot/devika

ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

# setting up python3
RUN apt-get update && apt-get upgrade -y
RUN apt-get install -y build-essential software-properties-common curl sudo wget git
RUN apt-get install -y python3 python3-pip
RUN curl -fsSL https://astral.sh/uv/install.sh | sudo -E bash -
RUN $HOME/.cargo/bin/uv venv
ENV PATH="/home/nonroot/devika/.venv/bin:$HOME/.cargo/bin:$PATH"

# copy the devika python engine only (the venv was already created above)
COPY requirements.txt /home/nonroot/devika/
RUN UV_HTTP_TIMEOUT=100000 $HOME/.cargo/bin/uv pip install -r requirements.txt 
RUN playwright install --with-deps chromium

COPY src /home/nonroot/devika/src
COPY config.toml /home/nonroot/devika/
COPY devika.py /home/nonroot/devika/
RUN chown -R root:root /home/nonroot/devika

USER root
WORKDIR /home/nonroot/devika
ENV PATH="/home/nonroot/devika/.venv/bin:$HOME/.cargo/bin:$PATH"
RUN mkdir /home/nonroot/devika/db

ENTRYPOINT [ "python3", "-m", "devika" ]
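
With both files in place, the stack builds and runs with plain Docker Compose. A minimal sketch, using the service name from the compose file above:

docker compose build
docker compose up -d

# follow the backend logs to confirm it starts and reaches the external Ollama server
docker compose logs -f devika-backend-engine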

Make sure that your Ollama server has these models (pull commands follow the list):

  • openchat:7b-v3.5-1210-q5_K_M (4.8GB)
  • mistral-openorca:7b-q5_K_M (4.8GB)
  • qwen:14b-chat-v1.5-q4_K_M (8.6GB)
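
If they are not installed yet, pull them on the Ollama machine itself. A small sketch (run on the Ollama server, not inside the Devika containers):

ollama pull openchat:7b-v3.5-1210-q5_K_M
ollama pull mistral-openorca:7b-q5_K_M
ollama pull qwen:14b-chat-v1.5-q4_K_M
ollama list    # all three models should show up here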

Put your Bing API key in config.toml; the search endpoint is:
BING = "https://api.bing.microsoft.com/v7.0/search"
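
A quick way to confirm the key works is to call the endpoint directly. A sketch, where BING_KEY is a hypothetical placeholder for your real Bing Web Search v7 subscription key:

# replace BING_KEY with your actual subscription key
curl -s -H "Ocp-Apim-Subscription-Key: BING_KEY" \
  "https://api.bing.microsoft.com/v7.0/search?q=devika"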

@hqnicolas

[Screenshot from 2024-03-29 20:59:14]
Amazing Work!
Thank you!

hqnicolas closed this as not planned Mar 30, 2024
hqnicolas reopened this Mar 30, 2024
hqnicolas changed the title from "Using devika With External Ollama Server" to "Using devika on Docker Compose With External Ollama Server" Mar 30, 2024
@MalteBoehm

Great one, can it work on an M1?


hqnicolas commented Mar 30, 2024

can it work on an M1?

@MalteBoehm The only ARM board I have tested it on is an RK3566, and it works.
Ollama was running externally on a desktop with an RX 7800 XT.

@ItsNeil17

Great one, can it work on an M1?

I just tested it with an M1, and it does work. It took me some time to set up, but after the setup it worked great.

@janvi2021

[Screenshot]
Not able to create a project; it says Devika is inactive.

subhajit20 commented Apr 1, 2024

[Screenshot] Not able to create a project; it says Devika is inactive.

The same thing is happening to me as well. After reselecting the project option the agent becomes active, but it is not able to generate anything corresponding to the request I give it. I ran it a couple of times but nothing happens.


hqnicolas commented Apr 1, 2024

@subhajit20 and @janvi2021

stitionai/devika works fine for me today.
It was fixed in today's version; I will fork it and apply the Docker Compose changes for an external Ollama server:
https://github.com/hqnicolas/devika

sebj84 commented Apr 2, 2024

@hqnicolas thank you for the last update. Unfortunately I have the same symptoms as @subhajit20 and @janvi2021 with the last push (the web page is up, so the frontend is OK, but the backend is not fully running).
NOTE: is there a specific reason to have the openchat, mistral, and qwen models on the external Ollama side? I didn't see these LLMs in the config files (maybe that is the reason the backend is not working).
Unfortunately there is no debug output from the backend container. It seems that the healthcheck on port 1337 is not up (the container is "not healthy"), but I can't see why the service is not working :(
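
For anyone hitting the same symptom, a generic way to see why the healthcheck stays down (a sketch using standard Docker commands, not specific to this repo):

# backend logs
docker compose logs devika-backend-engine

# output of the last healthcheck attempts
docker inspect --format '{{json .State.Health}}' $(docker compose ps -q devika-backend-engine)

# run the healthcheck command by hand inside the container
docker compose exec devika-backend-engine curl -f http://localhost:1337/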


hqnicolas commented May 11, 2024

I didn't see these LLMs in the config files (maybe that is the reason the backend is not working). Unfortunately there is no debug output from the backend container. It seems that the healthcheck on port 1337 is not up (the container is "not healthy"), but I can't see why the service is not working :(

I run everything on Debian-based Ubuntu 22.04 and 23.10.
My Ollama server is on another machine, serving the web,
but I also take the opportunity to access it with Devika.
I test a new version every month and push to my repo every month.
Today I'm pushing a861328
and will test and release with a tutorial this time.
https://github.com/hqnicolas/devika

@sebj84

Make sure that your Ollama server has these models:

  • openchat:7b-v3.5-1210-q5_K_M (4.8GB)
  • mistral-openorca:7b-q5_K_M (4.8GB)
  • qwen:14b-chat-v1.5-q4_K_M (8.6GB)

Put your Bing API key in config.toml; the search endpoint is:
BING = "https://api.bing.microsoft.com/v7.0/search"
