Robby provides an extensible Emacs interface to OpenAI.
Robby provides a set of functions to generate prompts and act on AI responses, plus a macro and other utilities to assemble them into reusable AI commands. Robby uses these facilities to build core commands that operate on the region or buffer, and task-specific commands that fix code, generate comments, summarize text, and so on. Robby also provides an interactive builder for creating custom commands without writing code.
- About
- Requirements
- Installation
- Usage
- Building Custom Commands with Robby Builder
- Defining Custom Commands
- Customization
- Animated Gifs
- Emacs 29.1 or higher
Robby uses curl to make HTTP requests if it is available; otherwise it falls back to url-retrieve. Streaming responses are only supported with curl.
To install robby in Emacs 29, first install it via package-vc-install and then turn on robby-mode:
(if (not (package-installed-p 'robby))
    (package-vc-install "https://github.com/stevemolitor/robby"))
(robby-mode)

;; (optional) bind a prefix key to the robby-command-map, default is "C-c C-r":
(global-set-key (kbd "s-r") robby-command-map)

;; (optional) show robby view window in a side window on the right side:
(add-to-list 'display-buffer-alist
             '("\\*robby\\*"
               (display-buffer-in-side-window)
               (side . right)
               (window-width . 0.4)))
To install robby in Emacs 30, first install it via use-package with the :vc option, and then turn on robby-mode:
(use-package robby
  :vc (:url "https://github.com/stevemolitor/robby"
       :branch "main"
       :rev :newest)
  :bind-keymap
  ;; (optional) bind a prefix key to the robby-command-map, default is "C-c C-r":
  ("s-r" . robby-command-map)
  :config
  (robby-mode)
  ;; (optional) show robby view window in a side window on the right side:
  (add-to-list 'display-buffer-alist
               '("\\*robby\\*"
                 (display-buffer-in-side-window)
                 (side . right)
                 (window-width . 0.4))))
You need an OpenAI API key to use robby
. You can set the variable
robby-openai-api-key
to the key, or set it to a function that returns the key.
;; not very secure, see below re using auth sources for a better approach:
(setq robby-openai-api-key "my-key")
By default robby-openai-api-key
is set to the function
robby-get-api-key-from-auth-source
which reads from your Auth Source file,
~/.authinfo
or ~/.netrc
. You need to add a line to that file that looks like this:
machine api.openai.com login apikey password OPENAI_API_KEY
For instructions on using keys securely within Emacs using GPG and Auth Sources see
the “Mastering Emacs” article Keeping Secrets in Emacs with GnuPG and Auth
Sources. Robby will decrypt and read the API key from the encrypted
~/.authinfo.gpg
if you have set that up. This is what I do.
Robby comes with the following generic built-in commands:
Ask for a prompt in the minibuffer, send the prompt to OpenAI, and display the response in the minibuffer. Maintain conversational history of previous prompts and responses, up to robby-max-history
prompt/response pairs.
Query AI from the region, and respond in a read-only markdown view window. Maintain conversational history of previous prompts and responses.
You can refine the response by typing v
:
Like robby-chat
, but reads prompt from the current region, or the entire buffer if no active region. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.
Query AI from the region, and prepend the response to the region, or insert it at point if there is no selected region. If there is no selected region, the prompt is read from the entire buffer. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.
Query AI from the region, and append the response to the region, or insert it at point if there is no selected region. If there is no selected region, the prompt is read from the entire buffer. You can supply an optional prompt prefix from the minibuffer.
Query AI from the region, and replace the region with the response, or insert it at point if there is no selected region. If there is no selected region, the prompt is read from the entire buffer. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.
If a prefix argument is supplied, robby will display the changes in a diff buffer and ask for confirmation before applying.
Robby also includes a handful of example commands for specific tasks. See robby-commands.el for their definitions. You may want to copy them and adjust the prompts to suit your needs, or use them as inspiration for your own commands.
Here is the list of example commands:
Describe code in the selected region, show description in robby view window.
Fix code in the selected region.
Preview changes in a diff buffer when invoked with a prefix argument.
Generate git commit message title from staged changes.
Add a comment for the code in the selected region or buffer. Preview changes in a diff buffer when invoked with a prefix argument.
Write some tests for the code in the region, and append them to the region.
Summarize the text in the selected region, or the entire buffer if there is no selected region; show the summary in the robby view window.
Proofread the text in the selected region.
Preview changes in a diff buffer when invoked with a prefix argument.
M-x robby-commands
will display a transient menu for executing the robby core and task-specific commands:
To see command options (currently just one) set the transient level to 4
:
With the d
option selected, before fixing code or proofreading text, robby will
display the changes in a diff buffer and ask for confirmation before applying.
Robby passes the conversation history of previous messages to OpenAI.
Conversation history is local to the output buffer of the command. For most
commands this is the current buffer, but for robby-chat
and
robby-chat-from-region
it is the *robby*
view output buffer.
You can clear the history for a buffer with the robby-clear-history
command.
Note that commands can opt out of conversation history by setting the historyp
option to nil
.
The robby-max-history
customization variable specifies the maximum number of
previous prompt/response pairs to keep in the conversation history. Its default
is 2. Increasing this value will pass more history context to OpenAI, at the
cost of using more tokens. Set it to 0 to turn conversation history off.
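For example, to keep four prompt/response pairs instead of two:

;; keep up to 4 previous prompt/response pairs (uses more tokens):
(setq robby-max-history 4)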
Set robby-logging
to t
to enable logging. Robby will log OpenAI requests and responses in the *robby-log*
buffer.
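For example, to turn logging on:

;; log OpenAI requests and responses to the *robby-log* buffer:
(setq robby-logging t)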
Running robby-builder
(C-c C-r b
) will bring up a transient menu to build and execute robby commands interactively. You can use it to tune your prompt, API options, and so on. When you are satisfied with the result you can save the command via robby-insert-last-command
:
To see advanced options, set the transient level to 6
:
Press A
in the builder to see a menu of chat API options. For example, you can select which chat model to use. The first time you customize the model from the builder robby will fetch the list of models available to your account. Press m
to pick a different model:
You can experiment with the various chat API options to tune a particular
command. For example, for certain commands, you may want to set the
robby-chat-temperature
to 0
to produce more deterministic results. For other
commands, you may want to choose a different model, higher max tokens etc. See the
OpenAI Chat API Documentation for details on the various options. When you save
a command via robby-insert-last-command
the API options you used will be
persisted with the command definition.
When you press m to select a model, robby fetches the models available to your account from OpenAI:
You can also access the API options transient directly via M-x robby-api-options
, or by customizing the robby-chat
customization group.
Use the robby-define-command
macro to define custom robby commands. Here is a simple example:
(require 'cl-macs)

(robby-define-command
 what-is-emacs
 "Tell me what emacs is. Print response in minibuffer."
 :prompt "What is emacs?"
 :action (cl-function (lambda (&key text &allow-other-keys)
                        (message text)))
 :never-stream-p t)
The first argument is the name of the interactive command, the second is the docstring.
The :prompt
can either be a string or a function. If it’s a string, that string
is sent to OpenAI as the prompt. If it’s a function, the result of calling that
function is used as the prompt. The :action
function does something with the
response. We turn off streaming with :never-stream-p t
. (We’ll talk about
streaming responses below.)
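For comparison, here is a hedged sketch of a command that uses a string :prompt; in that case the prompt is built from the region (or the entire buffer), prefixed with the string, as described in the command options section below. The command name and prompt wording are just examples:

(robby-define-command
 summarize-region
 "Summarize the selected region, show the summary in the robby view window."
 :prompt "Summarize the following text: "
 :action #'robby-respond-with-robby-chat)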
Here’s a command that reads the prompt from the minibuffer, and responds in the minibuffer. It’s a slightly simplified version of robby-message
:
(cl-defun get-prompt-from-minibuffer (&rest _)
  "Get Robby prompt from minibuffer."
  (read-string "Request for AI overlords: "))

(cl-defun respond-with-message (&key text &allow-other-keys)
  "Print TEXT in minibuffer."
  (message text))

(robby-define-command
 ask-ai
 "Read prompt from minibuffer, print response to minibuffer."
 :prompt #'get-prompt-from-minibuffer
 :action #'respond-with-message
 :never-stream-p t)
To handle streaming responses our action function needs to handle receiving the response in chunks. Here is an example of a command that streams the response after the selected region, or at the point if no region is selected:
(cl-defun stream-after-region (&key text beg end chars-processed &allow-other-keys)
  "Stream response after region."
  (goto-char (+ end chars-processed))
  (insert text))

(robby-define-command
 append-response
 "Read prompt from minibuffer, append response to selected region, or point if no region."
 :prompt #'get-prompt-from-minibuffer
 :action #'stream-after-region)
With streaming responses, text
is the current chunk. The action will be called
repeatedly for each chunk received.
The beg
and end
arguments are the start and end of the region when the command
was invoked, or the point if no selected region. Note that robby commands are
asynchronous, so the region or point may have changed by the time the response
comes back.
The :chars-processed
argument records the number of characters previously
received and processed, so you can calculate where to put the next chunk.
You can use a grounding function to process the text response after receiving it from OpenAI, but before sending it to the action. This can help clean up responses before displaying them to the user. For example, robby provides a robby-format-message-text
grounding function to escape any %
characters to avoid messing up the message
function:
(defun robby-format-message-text (response)
  "Replace % with %% in RESPONSE to avoid format string errors calling `message'."
  (replace-regexp-in-string "%" "%%" response))

(robby-define-command
 ask-ai
 "Read prompt from minibuffer, print response to minibuffer."
 :prompt #'get-prompt-from-minibuffer
 :action #'respond-with-message
 :never-stream-p t
 :grounding-fns #'robby-format-message-text)
The :grounding-fns
option takes either a list of grounding functions that will
be executed in order, or a single grounding function as shown above.
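For example, here is a hedged sketch of passing two grounding functions, applied in order; robby-remove-quotes and robby-format-message-text are the built-in grounding functions mentioned elsewhere in this document, and the command name is just an example:

(robby-define-command
 ask-ai-cleaned
 "Read prompt from minibuffer, print cleaned-up response to minibuffer."
 :prompt #'get-prompt-from-minibuffer
 :action #'respond-with-message
 :never-stream-p t
 :grounding-fns (list #'robby-remove-quotes #'robby-format-message-text))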
The prompt or action options can do more than just operate on the
selected region. For example, the robby-git-commit-message command
function
invokes a shell command to get the list of staged changes in a git
repository and generates a one-line git commit message:
(cl-defun robby-get-prompt-from-git-diff (&key prompt-prefix &allow-other-keys)
  (let* ((dir (locate-dominating-file default-directory ".git"))
         (diff (shell-command-to-string (format "cd %s && git diff --staged" dir))))
    (format "%s\n%s" prompt-prefix diff)))

(robby-define-command
 robby-git-commit-message
 "Generate git commit message title."
 :prompt #'robby-get-prompt-from-git-diff
 :action #'robby-prepend-response-to-region
 :prompt-args '(:prompt-prefix "For the following git diff, provide a concise and precise commit title capturing the essence of the changes in less than 50 characters.\n")
 :grounding-fns #'robby-remove-quotes
 :never-stream-p t)
You pass custom OpenAI API options in the :api-options
property list when defining a custom command. For example, this command sets the OpenAI max_tokens
property to 2000
, just for this command:
(robby-define-command
 robby-describe-code
 "Describe code in the selected region, show description in help window."
 :historyp nil
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix "Describe the following code: ")
 :action #'robby-respond-with-robby-chat
 :api-options '(:max-tokens 2000))
Here is the complete list of command options:
If a function, the command calls it to obtain the prompt, passing the interactive prefix argument. If a string, the prompt is built from the region, or the entire buffer contents if there is no region, prefixed with the PROMPT string.
Prompt functions take the following keyword arguments:
- arg - The prefix arg, if any, for the invoked command.
- prompt-prefix - String to prepend to the prompt.
- prompt-suffix - String to append to the prompt.
- prompt-buffer - The buffer to get the prompt from. Usually this is the current buffer, but commands can specify other buffers.
- never-ask-p - Prompt functions like robby-get-prompt-from-region ask the user for a prompt prefix before executing the command. Pass never-ask-p t to turn that behavior off.
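As a sketch, a custom prompt function might use a few of these keyword arguments like this; the function name is hypothetical, and this is a simplified stand-in for the built-in robby-get-prompt-from-region:

(cl-defun my-prompt-from-region (&key prompt-prefix prompt-suffix prompt-buffer &allow-other-keys)
  "Build a prompt from the region or buffer text in PROMPT-BUFFER."
  (with-current-buffer (or prompt-buffer (current-buffer))
    (let ((text (if (use-region-p)
                    (buffer-substring-no-properties (region-beginning) (region-end))
                  (buffer-substring-no-properties (point-min) (point-max)))))
      (concat (or prompt-prefix "") text (or prompt-suffix "")))))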
Before sending the prompt to AI, robby formats the prompt using format-spec
with the following format specs:
- %e - current file extension
- %f - current buffer file name
- %l - user login name
- %n - user full name
You can use these format specifiers in your own prompts to provide extra context. For example here is a simplified version of robby-add-comment
that passes the file extension by using the %e
format spec:
(robby-define-command
 add-comment
 "Add a comment for the code in the selected region or buffer."
 :historyp nil
 :never-stream-p t
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix
                "The file extension for this code is \"%e\". First, determine the programming language based on the file extension. Then, write a documentation comment for the code with the appropriate comment delimiters for the programming language. Here is the code:
")
 :action #'robby-prepend-response-to-region)
- Type: Function.
- Description: The function to invoke when the request is complete, or for streaming responses as each chunk arrives. It receives the response text and the region boundaries as keyword arguments, described below.
Action functions take the following keyword options:
- arg - The prefix arg, if any, for the invoked command.
- text - The response text received from OpenAI. For streaming responses, this will be the current chunk.
- beg - The beginning of the response region, an integer. This tells action functions where to start inserting or replacing text.
- end - The end position of the response region, an integer.
- chars-processed - For streaming responses, the number of characters already processed. Actions can use chars-processed + beg to calculate where to insert the next chunk.
- completep - For streaming responses, indicates if the response is complete. On the last chunk completep will be t.
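For example, here is a hedged sketch of a streaming action that uses chars-processed and completep; the function and command names are hypothetical:

(cl-defun stream-after-region-and-notify (&key text end chars-processed completep &allow-other-keys)
  "Insert each chunk after the region, and message when the response is complete."
  (goto-char (+ end chars-processed))
  (insert text)
  (when completep
    (message "Robby response complete")))

(robby-define-command
 append-response-and-notify
 "Read prompt from minibuffer, stream response after region, notify when done."
 :prompt #'get-prompt-from-minibuffer
 :action #'stream-after-region-and-notify)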
- Type: Property list.
- Description: Options to pass to the OpenAI API. These options are merged with the customization options specified in either the ‘robby-chat-api’ or ‘robby-completions-api’ customization group.
- Type: Function, or a list of functions.
- Description: Used to format the response from OpenAI before passing it to the action. Only used if ‘NEVER-STREAM-P’ is true.
- Type: Regular expression.
- Description: If the response matches this pattern, do not perform the action. Useful with prompts that instruct OpenAI to respond with a certain message if there is nothing to do.
- Type: String (Optional).
- Description: The message to display when NO-OP-PATTERN matches.
- Type: Boolean.
- Description: Include conversation history in the OpenAI request if true.
- Type: Boolean.
- Description: Never stream the response if true. Overrides the ‘robby-stream’ customization variable if present.
Get Robby prompt from minibuffer.
Get robby prompt from buffer region. If no selected region return all text in buffer.
Get prompt from region, or entire buffer if no selected region.
If supplied, PROMPT-PREFIX and/or PROMPT-SUFFIX are prepended or appended to the buffer or region text to make the complete prompt.
If both PROMPT-PREFIX and PROMPT-SUFFIX are nil or not specified, prompt the user for a prompt prefix in the minibuffer.
Show TEXT in minibuffer message.
Prepend AI response to region, or insert at point if no selected region.
Append AI response to region, or insert at point if no selected region.
Replace region with AI response, or insert at point if no selected region.
Show TEXT in robby-chat-mode
buffer.
Extract the text between the first pair of fenced code blocks in RESPONSE.
Extract the text between the first pair of fenced code blocks in RESPONSE if in a programming mode, else return RESPONSE.
Replace %
with %%
in TEXT to avoid format string errors calling message
.
Remove the end of line character at the very end of a string if present.
Use customize-group robby
to see the various customization options. Robby will validate values as appropriate, for example robby-chat-temperature
values must be between 0.0
and 2.0
.
Customize the robby-chat-api
variable to use a different OpenAI compatible provider. By default it is set to https://api.openai.com/v1/chat/completions
.
The robby-chat-api
customization group specifies the OpenAI API options. Here are
a few important ones:
- robby-chat-api - customization group with options to pass to the Chat API.
- robby-chat-model - the model to use with the Chat API, for example "gpt-4" or "gpt-3.5-turbo".
- robby-chat-max-tokens - the maximum number of tokens to return in the response. The Robby default is 2000, but you may want to increase this for longer responses or decrease it to reduce token usage.

You can also set values in the robby-chat-api group for the current Emacs session by invoking the robby-api-options transient.
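For example, a hedged sketch of setting a couple of these options with setq (the values are illustrative only; customize-group works too):

;; use a different chat model and a smaller max tokens limit:
(setq robby-chat-model "gpt-4"
      robby-chat-max-tokens 1000)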
Using robby-chat
with conversation history:
Using robby-fix-code
with prefix arg to show diff preview before applying fix: