# local-server

Here are 2 public repositories matching this topic...

ollama-local-docker

Ollama Local Docker - A simple Docker-based setup for running Ollama's API locally with a web-based UI. Easily deploy and interact with Llama models like llama3.2 and llama3.2:1b on your local machine. This repository provides an efficient, containerized solution for testing and developing AI models using Ollama.

  • Updated Oct 12, 2024
  • HTML
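The repository's actual compose file is not reproduced on this page, but a minimal docker-compose sketch for a setup like this might look as follows. The service name, volume name, and web-UI choice are assumptions; the `ollama/ollama` image and port 11434 are Ollama's documented defaults:

```yaml
services:
  ollama:
    image: ollama/ollama            # official Ollama server image
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded models across restarts

volumes:
  ollama-data:
```

After `docker compose up -d`, a model such as `llama3.2:1b` can be pulled with `docker exec -it ollama ollama pull llama3.2:1b` and queried over HTTP at `http://localhost:11434/api/generate`.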
