
Context-aware prompt execution #331

Open
JasonWeill opened this issue Aug 10, 2023 · 0 comments

Problem

When a user asks a question, only the prompt itself, plus any data the user explicitly provides, is sent to the model. For example, the user can select data and send it to the chat UI's language model along with their message. The /ask command also augments a question with data from previous /learn commands. However, data from the notebook itself is not included.

Proposed Solution

If the user opts in, augment their prompt with contextual data from the current notebook. For example, if a user asks, "What is in column 2 of the dataframe?", and the preceding cell contains a Pandas dataframe, augment the prompt with information about that dataframe.
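One way this augmentation could work is sketched below. This is a minimal illustration, not Jupyter AI's implementation; the function name `augment_prompt` and the context format are assumptions for the example. It appends the dataframe's column names, dtypes, and a few sample rows to the user's prompt so the model can answer structural questions about the data.

```python
import pandas as pd


def augment_prompt(prompt: str, df: pd.DataFrame, max_rows: int = 3) -> str:
    """Append a compact description of a dataframe to the user's prompt.

    The context block carries column names, dtypes, and a few sample
    rows — enough for the model to reason about the data's structure
    without sending the whole dataframe.
    """
    context = (
        f"Columns and dtypes:\n{df.dtypes.to_string()}\n\n"
        f"First {max_rows} rows:\n{df.head(max_rows).to_string()}"
    )
    return f"{prompt}\n\n--- Notebook context ---\n{context}"


# Example: a dataframe like one from the preceding notebook cell
df = pd.DataFrame({"name": ["a", "b", "c"], "score": [1.5, 2.0, 3.5]})
print(augment_prompt("What is in column 2 of the dataframe?", df))
```

A real implementation would also need to decide which cell's output to treat as context and how to truncate large dataframes to stay within the model's context window.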

Additional context

As with all Jupyter AI features that send data to a large language model, users should explicitly opt in to having additional context from their notebook sent to language models with their prompts.
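The opt-in requirement above could be enforced at the point where the prompt is assembled. The sketch below is hypothetical: the parameter name `include_notebook_context` is illustrative, not an actual Jupyter AI setting. The key property is that context is attached only when the flag is explicitly set.

```python
from typing import Optional


def build_prompt(
    prompt: str,
    notebook_context: Optional[str] = None,
    include_notebook_context: bool = False,  # hypothetical opt-in flag
) -> str:
    """Attach notebook context to a prompt only when the user has opted in.

    Defaults to sending the bare prompt, so no notebook data leaves the
    machine unless the user explicitly enables context sharing.
    """
    if include_notebook_context and notebook_context:
        return f"{prompt}\n\n--- Notebook context (opt-in) ---\n{notebook_context}"
    return prompt
```

Defaulting the flag to `False` keeps the privacy-preserving behavior as the baseline, matching the opt-in requirement stated above.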
