Merge pull request #105 from grafana/jtheory/readmes-pub-prev
Readme updates for repo & plugin
jtheory authored Oct 20, 2023
2 parents 9d77643 + 4b941a6 commit 4554a95
Showing 2 changed files with 60 additions and 20 deletions.
55 changes: 48 additions & 7 deletions README.md
@@ -1,13 +1,27 @@
# Grafana LLM App (Public Preview)

A Grafana plugin designed to centralize access to LLMs, providing authentication, proxying, streaming, and custom extensions.
Installing this plugin will enable various pieces of LLM-based functionality throughout Grafana.

Note: The Grafana LLM App plugin is currently in [Public preview](https://grafana.com/docs/release-life-cycle/). Grafana Labs offers support on a best-effort basis, and there might be breaking changes before the feature is generally available.

## Install the plugin on Grafana Cloud

Prerequisites:
- Any Grafana Cloud environment (including Free)
- API connection details from an account with [OpenAI](https://platform.openai.com) or [Azure OpenAI](https://oai.azure.com/)

Steps:
1. In your Grafana instance, open Administration → Plugins
1. Select "All" instead of "Installed" and search for "LLM"
1. Click "Install via grafana.com"
1. On the [LLM app plugin page](https://grafana.com/grafana/plugins/grafana-llm-app/), you should see your instance listed; click "Install plugin"
1. Return to Grafana and search your installed plugins, reloading until the LLM plugin appears (this can take a minute or two)
1. Configure the plugin: choose your provider (OpenAI or Azure) and fill in the required fields
1. Save settings, then click "Enable" (upper right) to enable the plugin


## Install the plugin directly

To install this plugin, use the `GF_INSTALL_PLUGINS` environment variable when running Grafana:
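For example, with Docker Compose (a minimal sketch; `grafana-llm-app` is the plugin ID used in the provisioning examples in this README):

```yaml
# docker-compose.yaml -- minimal sketch: install the LLM app on startup
services:
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
    environment:
      GF_INSTALL_PLUGINS: grafana-llm-app
```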

@@ -46,6 +60,33 @@ apps:
openAIKey: $OPENAI_API_KEY
```
### Using Azure OpenAI
To provision the plugin to use Azure OpenAI, use settings similar to this:
```yaml
apiVersion: 1

apps:
  - type: 'grafana-llm-app'
    disabled: false
    jsonData:
      openAI:
        provider: azure
        url: https://<resource>.openai.azure.com
        azureModelMapping:
          - ["gpt-3.5-turbo", "gpt-35-turbo"]
    secureJsonData:
      openAIKey: $OPENAI_API_KEY
```
where:
- `<resource>` is your Azure OpenAI resource name
- `azureModelMapping` contains `[model, deployment]` pairs, mapping each OpenAI model name to the Azure deployment that should serve requests for it
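Concretely, each entry maps one OpenAI model name to one Azure deployment name. An illustrative sketch of the lookup (this mirrors the documented behavior; it is not the plugin's actual code):

```typescript
// Illustrative sketch of how [model, deployment] pairs resolve requests.
type AzureModelMapping = [model: string, deployment: string][];

const azureModelMapping: AzureModelMapping = [
  ["gpt-3.5-turbo", "gpt-35-turbo"],
];

// Given an OpenAI model name, find the Azure deployment that should serve it.
function resolveDeployment(
  mapping: AzureModelMapping,
  model: string
): string | undefined {
  return mapping.find(([m]) => m === model)?.[1];
}

console.log(resolveDeployment(azureModelMapping, "gpt-3.5-turbo")); // "gpt-35-turbo"
```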


## Adding LLM features to your plugin or Grafana core

To make use of this plugin when adding LLM-based features, you can use the helper functions in the `@grafana/experimental` package.
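For example, an "explain this panel" feature might look like the sketch below. The `enabled`/`chatCompletions` client shape is an assumption modeled on that package's LLM helpers (verify the exact names against the `@grafana/experimental` version you install); the client is injected so the sketch stays self-contained:

```typescript
// Sketch of an LLM-backed feature built on the helpers in @grafana/experimental.
// The LLMClient interface is an assumed shape, not the package's verbatim API.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type ChatRequest = { model: string; messages: ChatMessage[] };
type ChatResponse = { choices: { message: ChatMessage }[] };

interface LLMClient {
  enabled(): Promise<boolean>; // is the LLM app installed and configured?
  chatCompletions(req: ChatRequest): Promise<ChatResponse>;
}

// Pure helper: build the request payload for an "explain this panel" feature.
export function buildExplainRequest(panelTitle: string, panelJson: string): ChatRequest {
  return {
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "You explain Grafana panels to users." },
      { role: "user", content: `Explain the panel "${panelTitle}":\n${panelJson}` },
    ],
  };
}

export async function explainPanel(
  client: LLMClient,
  panelTitle: string,
  panelJson: string
): Promise<string | null> {
  // Degrade gracefully when the LLM app is not available.
  if (!(await client.enabled())) {
    return null;
  }
  const res = await client.chatCompletions(buildExplainRequest(panelTitle, panelJson));
  return res.choices[0]?.message.content ?? null;
}
```

In a real plugin you would pass the helper object exported by `@grafana/experimental` as the client rather than a hand-rolled one.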
@@ -218,4 +259,4 @@ To use the example app in conjunction with the LLM plugin:
- Add notes to changelog describing changes since last release
- Merge PR for a branch containing those changes into main
- Go to [Drone](https://drone.grafana.net/grafana/grafana-llm-app) and identify the build corresponding to the merge into main
- Promote to target 'publish'
25 changes: 12 additions & 13 deletions src/README.md
@@ -1,6 +1,6 @@
# Grafana LLM app (public preview)

This Grafana application plugin centralizes access to LLMs across Grafana.

It is responsible for:

@@ -11,32 +11,31 @@ It is responsible for:

Future functionality will include:

- support for additional LLM providers, including the ability to choose your own at runtime
- rate limiting of requests to LLMs, for cost control
- token and cost estimation
- RBAC to only allow certain users to use LLM functionality

Note: The Grafana LLM App plugin is currently in [Public preview](https://grafana.com/docs/release-life-cycle/). Grafana Labs offers support on a best-effort basis, and there might be breaking changes before the feature is generally available.

## For users

Install and configure this plugin to enable various LLM-related functionality across Grafana.
This includes new functionality inside Grafana itself, such as explaining panels, or in plugins,
such as natural language query editors.

All LLM requests will be routed via this plugin, which ensures the correct API key is being
used and requests are routed appropriately.

## For plugin developers

This plugin is not designed to be directly interacted with; instead, use the convenience functions
in the [`@grafana/experimental`](https://www.npmjs.com/package/@grafana/experimental)
package which will communicate with this plugin, if installed.

Looking for working examples? Check https://github.com/grafana/grafana-llmexamples-app

First, add the latest version of `@grafana/experimental` to your dependencies in package.json:

```json
{
  "dependencies": {
    "@grafana/experimental": "latest"
  }
}
```
