Robby: Extensible Emacs Interface to OpenAI

./images/robby.png

About

Robby provides an extensible Emacs interface to OpenAI.

Robby provides a set of functions to generate prompts and act on AI responses, and a macro and other utilities to assemble these into reusable AI commands. Robby uses these facilities to build core commands that operate on the region or buffer, and task-specific commands that fix code, generate comments, summarize text, and more. Robby also provides an interactive builder to create custom commands without having to write code.

Requirements

  • Emacs 29.1 or higher

Robby will use curl to make HTTP requests if it is available; otherwise it falls back to url-retrieve. Streaming responses are only supported with curl.

Installation

Emacs 29

To install robby on Emacs 29, install it with package-vc-install and then turn on robby-mode:

(unless (package-installed-p 'robby)
  (package-vc-install "https://github.com/stevemolitor/robby"))

(robby-mode)

;; (optional) bind a prefix key to the robby-command-map, default is "C-c C-r":
(global-set-key (kbd "s-r") robby-command-map)

;; (optional) show robby view window in a side window on the right side:
(add-to-list 'display-buffer-alist
             '("\\*robby\\*"
               (display-buffer-in-side-window)
               (side . right)
               (window-width . 0.4)))

Emacs 30

To install robby on Emacs 30, install it with use-package and the :vc option, then turn on robby-mode:

(use-package robby
  :vc (:url "https://github.com/stevemolitor/robby"
       :branch "main"
       :rev :newest)
  :bind-keymap
  ;; (optional) bind a prefix key to the robby-command-map, default is "C-c C-r":
  ("s-r" . robby-command-map)
  :config
  (robby-mode)

  ;; (optional) show robby view window in a side window on the right side:
  (add-to-list 'display-buffer-alist
               '("\\*robby\\*"
                 (display-buffer-in-side-window)
                 (side . right)
                 (window-width . 0.4))))

Set API Key

You need an OpenAI API key to use robby. You can set the variable robby-openai-api-key to the key, or set it to a function that returns the key.

;; not very secure, see below re using auth sources for a better approach:
(setq robby-openai-api-key "my-key")
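
You can also set robby-openai-api-key to a function that returns the key. A minimal sketch, assuming robby accepts a zero-argument function; reading the key from an environment variable is just one illustrative option:

;; return the key from an environment variable (illustrative):
(setq robby-openai-api-key
      (lambda ()
        (getenv "OPENAI_API_KEY")))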

By default robby-openai-api-key is set to the function robby-get-api-key-from-auth-source which reads from your Auth Source file, ~/.authinfo or ~/.netrc. You need to add a line to that file that looks like this:

machine api.openai.com login apikey password OPENAI_API_KEY

For instructions on using keys securely within Emacs using GPG and Auth Sources see the “Mastering Emacs” article Keeping Secrets in Emacs with GnuPG and Auth Sources. Robby will decrypt and read the API key from the encrypted ~/.authinfo.gpg if you have set that up. This is what I do.

Usage

Core Commands

Robby comes with the following generic built-in commands:

robby-message

Ask for a prompt in the minibuffer, send the prompt to OpenAI, and display the response in the minibuffer. Maintain conversational history of previous prompts and responses, up to robby-max-history prompt/response pairs.

./images/message-prompt.png

./images/message-response.png

robby-chat

Ask for a prompt in the minibuffer and show the response in a read-only markdown view window. Maintain conversational history of previous prompts and responses.

./images/view-prompt.png

./images/view-response.png

You can refine the response by typing v:

./images/view-prompt-2.png

./images/view-response-2.png

robby-chat-from-region

Like robby-chat, but reads prompt from the current region, or the entire buffer if no active region. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.

robby-prepend-region

Query AI from the region and prepend the response to the region, or insert at point if no selected region. If no selected region, read the prompt from the entire buffer. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.

robby-append-region

Query AI from the region and append the response to the region, or insert at point if no selected region. If no selected region, read the prompt from the entire buffer. You can supply an optional prompt prefix from the minibuffer.

robby-replace-region

Query AI from the region and replace the region with the response. If no selected region, read the prompt from the entire buffer. You can supply an optional prompt prefix from the minibuffer, to provide extra context or instructions.

If a prefix argument is supplied, robby will display the changes in a diff buffer and ask for confirmation before applying.

Commands for Specific Tasks

Robby also includes a handful of example commands for specific tasks. You can use these as inspiration when creating your own commands.

See robby-commands.el for the definitions of robby's commands. You may want to copy them and adjust the prompts to suit your needs.

Here is the list of example commands:

robby-describe-code

Describe code in the selected region, show description in robby view window.

robby-fix-code

Fix code in the selected region.

Preview changes in a diff buffer when invoked with a prefix argument.

robby-git-commit-message

Generate git commit message title from staged changes.

robby-add-comment

Add a comment for the code in the selected region or buffer. Preview changes in a diff buffer when invoked with a prefix argument.

robby-write-tests

Write some tests for the code in the region, and append them to the region.

robby-summarize

Summarize the text in the selected region, or the entire buffer if no selected region, and show the summary in the robby view window.

robby-proof-read

Proofread the text in the selected region.

Preview changes in a diff buffer when invoked with a prefix argument.

Commands Transient

M-x robby-commands will display a transient menu for executing the robby core and task-specific commands:

./images/commands-transient.png

To see command options (currently just one), set the transient level to 4:

./images/advanced-commands-options.png

With the d option selected, before fixing code or proofreading text, robby will display the changes in a diff buffer and ask for confirmation before applying.

Conversation History

Robby passes the conversation history of previous messages to OpenAI. Conversation history is local to the output buffer of the command. For most commands this is the current buffer, but for robby-chat and robby-chat-from-region it is the *robby* view output buffer.

You can clear the history for a buffer with the robby-clear-history command.

Note that commands can opt out of conversation history by setting the historyp option to nil.

The robby-max-history customization variable specifies the maximum number of previous prompt/response pairs to keep in the conversation history. Its default is 2. Increasing this value passes more history context to OpenAI, at the cost of using more tokens. Set it to 0 to turn conversation history off.
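
For example, a minimal sketch (the value 4 is arbitrary):

;; keep the last four prompt/response pairs (uses more tokens):
(setq robby-max-history 4)

;; or turn conversation history off entirely:
(setq robby-max-history 0)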

Logging

Set robby-logging to t to enable logging. Robby will log OpenAI requests and responses in the *robby-log* buffer.
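
For example:

(setq robby-logging t)  ;; log requests and responses to the *robby-log* buffer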

Building Custom Commands with Robby Builder

Running robby-builder (C-c C-r b) will bring up a transient menu to build and execute robby commands interactively. You can use it to tune your prompt, API options, and so on. When you are satisfied with the result you can save the command via robby-insert-last-command:

./images/builder.png

To see advanced options, set the transient level to 6:

./images/advanced-builder-options.png

Press A in the builder to see a menu of chat API options. For example, you can select which chat model to use. The first time you customize the model from the builder robby will fetch the list of models available to your account. Press m to pick a different model:

./images/builder-api-options.png

You can experiment with the various chat API options to tune a particular command. For example, for certain commands you may want to set robby-chat-temperature to 0 to produce more deterministic results. For other commands you may want to choose a different model, a higher max tokens value, etc. See the OpenAI Chat API Documentation for details on the various options. When you save a command via robby-insert-last-command, the API options you used will be persisted with the command definition.

When you press m to select a model, robby will fetch the models available to you from OpenAI:

./images/api-options-models.png

You can also access the API options transient directly via M-x robby-api-options, or by customizing the robby-chat customization group.

Defining Custom Commands

Defining Simple Commands

Use the robby-define-command macro to define custom robby commands. Here is a simple example:

(require 'cl-macs)

(robby-define-command
 what-is-emacs
 "Tell me what emacs is. Print response in minbuffer"
 :prompt "What is emacs?"
 :action (cl-function (lambda (&key text &allow-other-keys)
                        (message text)))
 :never-stream-p t)

The first argument is the name of the interactive command, the second is the docstring.

The :prompt can either be a string or a function. If it’s a string, that string is sent to OpenAI as the prompt. If it’s a function, the result of calling that function is used as the prompt. The :action function does something with the response. We turn off streaming with :never-stream-p t. (We’ll talk about streaming responses below.)

Here’s a command that reads the prompt from the minibuffer, and responds in the minibuffer. It’s a slightly simplified version of robby-message:

(cl-defun get-prompt-from-minibuffer (&rest _)
  "Get Robby prompt from minibuffer."
  (read-string "Request for AI overlords: "))

(cl-defun respond-with-message (&key text &allow-other-keys)
  "Print TEXT in minibuffer."
  (message text))

(robby-define-command
 ask-ai
 "Read prompt from minibuffer, print response to minibuffer "
 :prompt #'get-prompt-from-minibuffer
 :action #'respond-with-message
 :never-stream-p t)

Handling Streaming Responses in Custom Commands

To handle streaming responses our action function needs to handle receiving the response in chunks. Here is an example of a command that streams the response after the selected region, or at the point if no region is selected:

(cl-defun stream-after-region (&key text beg end chars-processed &allow-other-keys)
  "Stream response after region."
  (goto-char (+ end chars-processed))
  (insert text))

(robby-define-command
 append-response
 "Read prompt from minibuffer, append response to selected region, or point if no region."
 :prompt #'get-prompt-from-minibuffer
 :action #'stream-after-region)

With streaming responses, text is the current chunk. The action will be called repeatedly for each chunk received.

The beg and end arguments are the start and end of the region when the command was invoked, or the point if no selected region. Note that robby commands are asynchronous, so the region or point may have changed by the time the response comes back.

The chars-processed argument records the number of characters previously received and processed, so you can calculate where to put the next chunk.

Grounding Functions

You can use a grounding function to process the text response after receiving it from OpenAI, but before sending it to the action. This can help clean up responses before displaying them to the user. For example, robby provides the robby-format-message-text grounding function to escape any % characters, to avoid messing up the message function:

(defun robby-format-message-text (response)
  "Replace % with %% in TEXT to avoid format string errors calling `message."
  (replace-regexp-in-string "%" "%%" response))

(robby-define-command
 ask-ai
 "Read prompt from minibuffer, print response to minibuffer "
 :prompt #'get-prompt-from-minibuffer
 :action #'respond-with-message
 :never-stream-p t
 :grounding-fns #'robby-format-message-text)

The :grounding-fns option takes either a list of grounding functions that will be executed in order, or a single grounding function as shown above.

A More Involved Example

The prompt or action options can do more than just operate on the selected region. For example, the robby-git-commit-message command invokes a shell command to get the staged changes in a git repository and generates a one-line git commit message:

(cl-defun robby-get-prompt-from-git-diff (&key prompt-prefix &allow-other-keys)
  (let* ((dir (locate-dominating-file default-directory ".git"))
         (diff (shell-command-to-string (format "cd %s && git diff --staged" dir))))
    (format "%s\n%s" prompt-prefix diff)))

(robby-define-command
 robby-git-commit-message
 "Generate git commit message title."
 :prompt
 #'robby-get-prompt-from-git-diff
 :action
 #'robby-prepend-response-to-region
 :prompt-args
 '(:prompt-prefix "For the following git diff, provide a concise and precise commit title capturing the essence of the changes in less than 50 characters.\n")
 :grounding-fns #'robby-remove-quotes
 :never-stream-p t)

Custom Command Options

You pass custom OpenAI API options in the :api-options property list when defining a custom command. For example, this command sets the OpenAI max_tokens option to 2000 for this command only:

(robby-define-command
 robby-describe-code
 "Describe code in the selected region, show description in help window."
 :historyp nil
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix "Describe the following code: ")
 :action #'robby-respond-with-robby-chat
 :api-options '(:max-tokens 2000))

Here is the complete list of command options:

prompt

If a function, the command calls it with the interactive prefix argument to obtain the prompt. If a string, robby grabs the text from the region, or the entire buffer if there is no region, and prefixes that text with the string to build the prompt.

Prompt functions take the following keyword arguments:

  • arg - The prefix arg, if any, for the invoked command.
  • prompt-prefix - String to prepend to the prompt.
  • prompt-suffix - String to append to the prompt.
  • prompt-buffer - The buffer to get prompt from. Usually, this is the current buffer, but commands can specify other buffers.
  • never-ask-p - Prompt functions like robby-get-prompt-from-region ask the user for a prompt prefix before executing the command. Pass never-ask-p t to turn that behavior off.

Before sending the prompt to AI, robby formats the prompt using format-spec with the following format specs:

  • %e - current file extension
  • %f - current buffer file name
  • %l - user login name
  • %n - user full name

You can use these format specifiers in your own prompts to provide extra context. For example, here is a simplified version of robby-add-comment that passes the file extension by using the %e format spec:

(robby-define-command
 add-comment
 "Add a comment for the code in the selected region or buffer."
 :historyp nil
 :never-stream-p t
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix
                "The file extension for this code is \"%e\". First, determine the programming language based on the file extension. Then, write a documentation comment for the code with the appropriate comment delimeters for the programming language. Here is the code:
")
 :action #'robby-prepend-response-to-region)

action

  • Type: Function.
  • Description: The function to invoke when the request is complete. It is called with keyword arguments that include the response text and the bounds of the selected region.

    Action functions take the following keyword options:

    • arg - The prefix arg, if any, for the invoked command.
    • text - The response text received from OpenAI. For streaming responses, this will be the current chunk.
    • beg - The beginning of the response region, an integer. This tells action functions where to start inserting or replacing text.
    • end - The end position of the response region, an integer.
    • chars-processed - For streaming responses, the number of characters already processed. Actions can use chars-processed + beg to calculate where to insert the next chunk.
    • completep - For streaming responses, indicates if the response is complete. On the last chunk completep will be t.

api-options

  • Type: Property list.
  • Description: Options to pass to the OpenAI API. These options are merged with the customization options specified in either the ‘robby-chat-api’ or ‘robby-completions-api’ customization group.

grounding-fns

  • Type: Function, or list of functions.
  • Description: Used to format the response from OpenAI before passing it to the action. Only used if ‘NEVER-STREAM-P’ is true.

no-op-pattern

  • Type: Regular expression.
  • Description: If the response matches this pattern, do not perform the action. Useful with prompts that instruct OpenAI to respond with a certain message if there is nothing to do.

no-op-message

  • Type: String (Optional).
  • Description: The message to display when NO-OP-PATTERN matches.

historyp

  • Type: Boolean.
  • Description: Include conversation history in the OpenAI request if true.

never-stream-p

  • Type: Boolean.
  • Description: Never stream the response if true. Overrides the ‘robby-stream’ customization variable if present.
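
Putting a few of these options together, here is a hypothetical sketch of a command that asks the model to reply with "NO CHANGES" when the region needs no edits, and skips the action in that case. The command name, prompt wording, and no-op text are invented for illustration:

(robby-define-command
 my-tidy-region
 "Hypothetical example: tidy the code in the region, doing nothing if it is already clean."
 :historyp nil
 :never-stream-p t
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix "Clean up the following code. If nothing needs to change, respond with exactly NO CHANGES:\n")
 :action #'robby-replace-region-with-response
 :no-op-pattern "NO CHANGES"
 :no-op-message "Nothing to tidy.")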

Built-in Prompt Functions

robby-get-prompt-from-minibuffer

Get Robby prompt from minibuffer.

robby--get-region-or-buffer-text

Get robby prompt from buffer region. If no selected region return all text in buffer.

robby-get-prompt-from-region

Get prompt from region, or entire buffer if no selected region.

If supplied, PROMPT-PREFIX and/or PROMPT-SUFFIX are prepended or appended to the buffer or region text to make the complete prompt.

If both PROMPT-PREFIX and PROMPT-SUFFIX are nil or not specified, prompt the user for a prompt prefix in the minibuffer.

Built-in Action Functions

robby-respond-with-message

Show TEXT in minibuffer message.

robby-prepend-response-to-region

Prepend AI response to region, or insert at point if no selected region.

robby-append-response-to-region

Append AI response to region, or insert at point if no selected region.

robby-replace-region-with-response

Replace region with AI response, or insert at point if no selected region.

robby-respond-with-robby-chat

Show TEXT in robby-chat-mode buffer.

Built-in Grounding Functions

robby-extract-fenced-text

Extract the text between the first pair of fenced code blocks in RESPONSE.

robby-extract-fenced-text-in-prog-modes

Extract the text between the first pair of fenced code blocks in RESPONSE if in a programming mode, else return RESPONSE.

robby-format-message-text

Replace % with %% in TEXT to avoid format string errors calling message.

robby-remove-trailing-end-of-line

Remove the end of line character at the very end of a string if present.
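
For example, a command that generates tests for the region and appends only the returned code could use robby-extract-fenced-text-in-prog-modes as its grounding function. A hypothetical sketch, with the command name and prompt wording invented for illustration:

(robby-define-command
 my-append-tests
 "Hypothetical example: write tests for the code in the region and append only the code."
 :never-stream-p t
 :prompt #'robby-get-prompt-from-region
 :prompt-args '(:prompt-prefix "Write unit tests for the following code. Respond with the tests in a fenced code block:\n")
 :action #'robby-append-response-to-region
 :grounding-fns #'robby-extract-fenced-text-in-prog-modes)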

Customization

Use M-x customize-group RET robby to see the various customization options. Robby will validate values as appropriate; for example, robby-chat-temperature values must be between 0.0 and 2.0.

OpenAI URL

Customize the robby-chat-api variable to use a different OpenAI-compatible provider. By default it is set to https://api.openai.com/v1/chat/completions.

API Options

The robby-chat-api customization group specifies the OpenAI API options. Here are a few important ones:

robby-chat-api
Customization group with options to pass to the Chat API.
robby-chat-model
The model to use with the Chat API, for example “gpt-4” or “gpt-3.5-turbo”.
robby-chat-max-tokens
The maximum number of tokens to return in the response. The Robby default is 2000, but you may want to increase this for longer responses or decrease it to reduce token usage.
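
These options can also be set directly in your init file; a minimal sketch (the values are illustrative):

(setq robby-chat-model "gpt-4")      ;; model to use with the Chat API
(setq robby-chat-max-tokens 4000)    ;; allow longer responses than the 2000 default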

You can also set values in the robby-chat-api group for the current Emacs session by invoking the robby-api-options transient.

Animated Gifs

Using robby-chat with conversation history:

./images/robby-chat-video.gif

Using robby-fix-code with prefix arg to show diff preview before applying fix:

./images/fix-code.gif
