
Help to get OpenHands working with local LLM #6918

Open · grannymaster opened this issue Feb 24, 2025 · 6 comments
Labels: troubleshooting/help (User requires help)

Comments


grannymaster commented Feb 24, 2025

Hey,

I'm having some trouble getting OpenHands working with a local Ollama instance.

My whole setup is running in Ubuntu on WSL.

Ollama:
I installed Ollama using the default command from their website:

curl -fsSL https://ollama.com/install.sh | sh
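
For a quick sanity check that the server is actually listening (the /api/tags endpoint is what ollama list queries under the hood):

curl http://127.0.0.1:11434/api/tags

A JSON list of the installed models means Ollama is reachable from the host itself.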

These are models I have installed:

ollama list
NAME                ID              SIZE      MODIFIED
qwen2.5-coder:7b    2b0496514337    4.7 GB    4 days ago
llama3:latest       365c0bd3c000    4.7 GB    6 weeks ago


Open WebUI:
Installed by following NetworkChuck's instructions; this is the Docker run command:

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
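
Worth noting: Open WebUI is started with --network=host, so from inside that container 127.0.0.1:11434 genuinely is the host's Ollama. The OpenHands container below uses ordinary port mapping instead, so inside it 127.0.0.1 refers to the container itself, and Ollama has to be reached via host.docker.internal.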

OpenHands:
For OpenHands I used Docker Compose:

name: openhands
services:
    openhands:
        stdin_open: true
        tty: true
        pull_policy: always
        environment:
            - SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.25-nikolaik
            - LOG_ALL_EVENTS=true
            - LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434"
        volumes:
            - /var/run/docker.sock:/var/run/docker.sock
            - ~/.openhands-state:/.openhands-state
        ports:
            - 3000:3000
        extra_hosts:
            - host.docker.internal:host-gateway
        container_name: openhands-app
        image: docker.all-hands.dev/all-hands-ai/openhands:0.25
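
One gotcha in this compose file that's easy to miss (it may or may not be the root cause here): with the list form of environment:, there is no shell to strip quotes, so the double quotes become part of the variable's value. The unquoted form avoids that:

        environment:
            - LLM_OLLAMA_BASE_URL=http://host.docker.internal:11434

(The docker run commands further down don't have this problem, because there the quotes are consumed by the shell.)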

OpenHands provider settings:

[screenshot: LLM provider settings in the OpenHands UI]

One thing I'm not sure about is the API key, as I've not set one up at any point, but according to things I found online, setting it to "ollama" should be good enough. This might be completely wrong.

When I type my prompt for OpenHands, this is the error I get:

RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 6909e00c6e9c41d9a1660ae705ad26c7. Error type: APIConnectionError

@grannymaster (Author)

Also tried this and 127.0.0.1:

[screenshot: provider settings with an alternate base URL]

enyst (Collaborator) commented Feb 24, 2025

Please check this out; we've been trying to gather instructions that work, depending on how you're running. Just in case it matters (maybe it doesn't), you may want to try running with the command in the README in this repo:
https://docs.all-hands.dev/modules/usage/llms/local-llms

@grannymaster (Author)

Sadly I'm getting the same error. I tried running the container with "docker run" instead of Compose.

Attempt #1

sudo docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.25-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands-state:/.openhands-state \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.25

Attempt #2

sudo docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.25-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -e LLM_OLLAMA_BASE_URL="http://localhost:11434" \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands-state:/.openhands-state \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.25
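
One way to narrow down an APIConnectionError like this (a debugging sketch, assuming curl is available in the image and the container is named openhands-app as above) is to probe Ollama from inside the OpenHands container:

docker exec -it openhands-app curl http://host.docker.internal:11434/api/tags

If that returns the model list, the container can reach Ollama and the problem is in the settings; if it hangs or is refused, it's a networking or Ollama-binding issue. Note that attempt #2 was expected to fail regardless: inside the container, localhost is the OpenHands container itself, not the WSL host.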

enyst (Collaborator) commented Feb 24, 2025

Not sure what the problem is, but just double-checking: it does need http://host.docker.internal:11434 as the base URL in the UI Settings, I think. And the model, yes, as you set it: with ollama/ followed by the exact name reported by ollama list.
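
Concretely, with the models listed above, the UI settings would look something like this (a sketch, not authoritative; the model name must match ollama list exactly):

Custom Model: ollama/qwen2.5-coder:7b
Base URL:     http://host.docker.internal:11434
API Key:      ollama   (Ollama itself doesn't check it; a dummy non-empty value is the usual convention)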

Could you please also double-check this section, to make the Ollama service accessible from other hosts: https://docs.all-hands.dev/modules/usage/llms/local-llms#configuring-ollama-service-wsl-en
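
For reference, the usual way to make a systemd-managed Ollama listen on all interfaces rather than only on localhost is an override along these lines (a sketch based on that doc section and the Ollama FAQ; adapt to your setup):

sudo systemctl edit ollama.service
# in the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama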

grannymaster (Author) commented Feb 24, 2025

Oh shit, I missed the WSL fix.

Now, though, when I prompt stuff I get this:

Parameter 'command' is expected to be one of ['view', 'create', 'str_replace', 'insert', 'undo_edit'].

EDIT: Seems to be an issue when using ollama/llama3:latest.
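
For context, that error means the model produced a malformed call to OpenHands' file-editor tool, whose command parameter must be one of the values listed in the message. A well-formed call carries arguments shaped roughly like this (a sketch of the shape, not the exact wire format):

{"command": "view", "path": "/workspace/some_file.py"}

A weaker model will sometimes invent a command such as "edit", or omit the parameter entirely, which triggers exactly this validation error.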

enyst (Collaborator) commented Feb 24, 2025

That one is from the LLM itself. I'm afraid OpenHands needs a relatively powerful LLM that follows instructions. I don't know the quirks of llama3; maybe it's transient? You could perhaps tell the LLM to pay attention to the exact tool definitions it needs to follow.
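
(Side note: of the two models installed earlier, qwen2.5-coder:7b is tuned for code and may follow tool schemas more reliably than llama3, though any 7B model is still on the small side for an agent workload like OpenHands.)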

mamoodi added the troubleshooting/help (User requires help) label on Feb 25, 2025