Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
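A minimal sketch of what a call through this SDK looks like, assuming the relevant provider API keys are already set as environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY); the model names and prompt are illustrative.

```python
# One OpenAI-format call shape, routed to different providers by prefixing
# the model name. Responses come back in the OpenAI response format.
from litellm import completion

messages = [{"role": "user", "content": "Summarize what an LLM gateway does."}]

openai_resp = completion(model="gpt-4o-mini", messages=messages)
claude_resp = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)
```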
CodeGate: Security, Workspaces and Multiplexing for AI Agentic Frameworks
LLMs and machine learning made easy
Build reliable, secure, and production-ready AI apps easily.
The Open Source Firewall for LLMs. A self-hosted gateway to secure and control AI applications with powerful guardrails.
Aqueduct is a simple yet fully-featured AI gateway built on top of the LiteLLM Router SDK using Django.
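For context, a short sketch of the underlying LiteLLM Router that Aqueduct builds on (not Aqueduct's own API): a model_list maps a public alias to one or more provider deployments, and the router balances and fails over between them. The alias and model names below are illustrative.

```python
# Two deployments registered under the same alias; the router picks between them.
import os
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "chat",  # alias exposed to callers
            "litellm_params": {
                "model": "openai/gpt-4o-mini",
                "api_key": os.environ.get("OPENAI_API_KEY"),
            },
        },
        {
            "model_name": "chat",  # second deployment under the same alias
            "litellm_params": {
                "model": "anthropic/claude-3-5-sonnet-20240620",
                "api_key": os.environ.get("ANTHROPIC_API_KEY"),
            },
        },
    ]
)

response = router.completion(
    model="chat",
    messages=[{"role": "user", "content": "Hello from the gateway."}],
)
print(response.choices[0].message.content)
```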
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
Gateway to control LLM API/SDK calls. Supports OpenAI, Azure, Anthropic, and Bedrock.
An MCP server 🚀 for MonkDB, licensed under Apache 2.0 📜
A lightweight AI model router for seamlessly switching between multiple AI providers (OpenAI, Anthropic, Google AI) through a unified API interface.
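The unified-interface idea behind a router like this can be sketched as a thin dispatch layer; the function and adapter names below are hypothetical and not taken from the repository above.

```python
# Hypothetical sketch: each provider gets an adapter with the same signature,
# and the router dispatches on a "provider/model" prefix in the model string.
from typing import Callable, Dict, List

Message = Dict[str, str]
Adapter = Callable[[str, List[Message]], str]

def call_openai(model: str, messages: List[Message]) -> str:
    raise NotImplementedError("wire up the OpenAI SDK here")

def call_anthropic(model: str, messages: List[Message]) -> str:
    raise NotImplementedError("wire up the Anthropic SDK here")

ADAPTERS: Dict[str, Adapter] = {
    "openai": call_openai,
    "anthropic": call_anthropic,
}

def route(model: str, messages: List[Message]) -> str:
    """Dispatch 'provider/model' strings to the matching provider adapter."""
    provider, _, model_id = model.partition("/")
    return ADAPTERS[provider](model_id, messages)
```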
MCP Connection Hub - A unified Model Context Protocol Gateway