From af492a9e037ceb27173a7243207df581d91b671d Mon Sep 17 00:00:00 2001
From: Automated
Date: Fri, 26 Jan 2024 22:13:34 +0000
Subject: [PATCH] Doc changes from 81225ab94bb8937935a053edcecb705ccdf7edea

---
 docs/plugins/directory.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index cfb40e2d..aa9fb276 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -26,6 +26,7 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-openrouter](https://github.com/simonw/llm-openrouter)** provides access to models hosted on [OpenRouter](https://openrouter.ai/).
 - **[llm-cohere](https://github.com/Accudio/llm-cohere)** by Alistair Shepherd provides `cohere-generate` and `cohere-summarize` API models, powered by [Cohere](https://cohere.com/).
 - **[llm-bedrock-anthropic](https://github.com/sblakey/llm-bedrock-anthropic)** adds support for Claude and Claude Instant by Anthropic via Amazon Bedrock.
+- **[llm-bedrock-meta](https://github.com/flabat/llm-bedrock-meta)** adds support for Llama 2 by Meta via Amazon Bedrock.
 If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
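
Note: the final context line in the hunk above points to LLM's support for OpenAI-compatible APIs via an `extra-openai-models.yaml` file in LLM's configuration directory. As a minimal sketch (the model names and URL below are illustrative placeholders, not values from this patch or the linked docs), an entry might look like:

```yaml
# extra-openai-models.yaml -- lives in LLM's config directory
# All values below are placeholder assumptions for illustration.
- model_id: my-hosted-model            # name used with `llm -m my-hosted-model`
  model_name: llama-2-7b-chat          # model name the OpenAI-compatible server expects
  api_base: "http://localhost:8000/v1" # base URL of the OpenAI-compatible endpoint
```

With an entry like this registered, a prompt could then be sent with something like `llm -m my-hosted-model "Say hello"`, without installing any extra plugin.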