Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Olares: An Open-Source Sovereign Cloud OS for Local AI
MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.
MemoryCache is an experimental development project to turn a local desktop environment into an on-device AI agent.
🦙 Ollama Telegram bot, with advanced configuration
Like ChatGPT's voice conversations with an AI, but entirely offline/private/trade-secret-friendly, using local AI models such as Llama 2 and Whisper
Extract structured data from local or remote LLMs
MVP of an idea using multiple local LLMs to simulate and play D&D
Evolve your OpenCV Image Processing filters using Cartesian Genetic Programming
Search your favorite websites and chat with them, on your desktop🌐
Empower Your Productivity with Local AI Assistants
Blueprint by Mozilla.ai for generating podcasts from documents using local AI
Visual summaries for code repositories
A browser extension that brings local AI to your device
🤖 AI-powered macOS automation framework - Control your Mac with natural language using GPT models. No code needed, just English instructions!
Telegram bot that interacts with the local Ollama 🦙 to answer user messages