A simple OpenAI chat client for use in automated flows.
Define AI assistant configurations in the Directus headless CMS and use them by calling this app through a simple HTTP API, like this:
```http
POST http://localhost:8080/ HTTP/1.1
Content-Type: application/x-www-form-urlencoded

q=Github&a=en_dict&t=***
```
In this example:

- `http://localhost:8080` - URL of an instance of this application
- `q` - prompt to the AI
- `a` - assistant code defined in Directus
- `t` - Directus user API token (defined in the back office)
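For programmatic callers, the same request can be sent with any HTTP client. Below is a minimal sketch using the standard fetch API (Node.js 18+); the assistant code and token values are placeholders, and the response is assumed here to be the completion ID as plain text:

```ts
// Minimal sketch of calling the service with fetch (Node.js 18+).
// The assistant code and token below are placeholders, not real values.
const params = new URLSearchParams({
  q: "Github",                        // prompt to the AI
  a: "en_dict",                       // assistant code defined in Directus
  t: "<directus-user-api-token>",     // Directus user API token
});

const response = await fetch("http://localhost:8080/", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: params.toString(),
});

// Assumed: the service answers with the ID of the saved completion.
const completionId = await response.text();
console.log(completionId);
```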
The application sends the prompt to the AI, gets the completion, saves it to the Directus DB, and returns the completion ID. The result can then be viewed at http://localhost:8080/<ID>.
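Retrieving a stored completion is then just a GET request to that URL. A minimal sketch, assuming the ID returned by the POST above and a text/HTML response:

```ts
// Sketch: fetch a previously saved completion by its ID.
// "abc123" is a placeholder for an ID returned by the POST call.
const result = await fetch("http://localhost:8080/abc123");
console.log(await result.text());
```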
On the Directus side, just two collections are used: `qai_assistant` and `qai_completion_cache`. The corresponding interfaces can be found in `directus.ts`.
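For orientation, the two interfaces might look roughly like the sketch below; the field names are illustrative assumptions, not the actual schema from `directus.ts`:

```ts
// Hypothetical shape of the two Directus collections; the real field
// names live in directus.ts and may differ from this sketch.
interface QaiAssistant {
  id: string;
  code: string;           // e.g. "en_dict", referenced by the "a" parameter
  system_prompt: string;  // instructions sent to the model
  model: string;          // e.g. an OpenAI model name
}

interface QaiCompletionCache {
  id: string;             // the completion ID returned to the caller
  assistant: string;      // reference to qai_assistant
  prompt: string;         // the "q" parameter
  completion: string;     // the model's answer
  date_created: string;
}
```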
The application takes its AI configuration from HashiCorp Vault or from environment variables. See the `.env-cmdrc.example` file.
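As a rough illustration of that setup, configuration resolution could look like the sketch below. The variable names, the Vault path, and the use of the node-vault client are assumptions for this example; the authoritative variable list is in `.env-cmdrc.example`.

```ts
// Sketch: prefer environment variables, fall back to Vault.
// Variable names, the Vault path, and the node-vault client are
// illustrative assumptions, not the project's actual configuration code.
import vault from "node-vault";

async function getOpenAiKey(): Promise<string> {
  // Environment variable wins if present (e.g. injected by env-cmd).
  const fromEnv = process.env.OPENAI_API_KEY;
  if (fromEnv) {
    return fromEnv;
  }

  // Otherwise read the secret from Vault (KV v2 nests values under data.data).
  const client = vault({
    endpoint: process.env.VAULT_ADDR,
    token: process.env.VAULT_TOKEN,
  });
  const secret = await client.read("secret/data/openai");
  return secret.data.data.api_key;
}
```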
I personally use it as part of Tasker's AutoShare command handlers: select text in any app, then share it to a pre-configured AI assistant such as a dictionary, web summarizer, or translator.
The CSS is borrowed from the beautiful Cactus theme.