Support pending/loading message while waiting for response #821
Conversation
Thank you so much for opening this! When it's ready for review, please add "Fixes #176" to the description, so that we can close this long-running enhancement request.
I've done what I wanted to for now, but the frontend part definitely needs some work. I would need help with that. Would you like me to mark it ready for review first before you take a look, or would you prefer to provide some comments for me to address and only mark it when it is in a more ready state?
@michaelchia Thank you for this contribution! Feel free to mark it as ready for review if you're seeking reviews. I'll have time to read through this on Monday.
Maybe the frontend part should be contributed to https://github.com/jupyterlab/jupyter-chat and reused here?
@michaelchia Thank you for this PR! This would be a wonderful addition to Jupyter AI. I circled this PR around our team, and we are in alignment with the UX you chose. I will review this tomorrow as soon as I get an opportunity and fast-track this PR for merge & release. Apologies for the delay in my review; our team is mostly out-of-office and I am currently cutting a new Jupyter AI release.
@michaelchia Awesome work! I love the context manager implementation you built in the Python backend. I left a few points of feedback concerning the frontend implementation below.
@michaelchia Thank you for working on this so quickly! I have a few more minor callouts and this PR should be good-to-go.
@michaelchia I left a couple more points of feedback above. No rush at all, just writing this as a reminder in case you missed my update on Friday. 😄
@michaelchia Awesome work, thank you very much for contributing and addressing my review feedback! I pushed a couple of commits that 1) simplify the rendered element and 2) match the font size of "Generating response..." to other Jupyternaut messages. Tested locally and everything is working as expected.
Really excited to include this in the next release! 🎉
…b#821)

* support pending message draft
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* styling + pending message for /fix
* change default pending message
* remove persona groups
* inline styling
* single timestamp
* use message id as component key (Co-authored-by: david qiu <david@qiu.dev>)
* fix conditional useEffect
* prefer MUI Typography in PendingMessageElement to match font size
* merge 2 outer div elements into 1

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: david qiu <david@qiu.dev>
Description
Adds support for a pending/loading message that lets the user know their request is being processed, and provides progress updates for multi-step pipelines.
Fixes #176.
Demo
Screen.Recording.2024-06-07.at.6.46.31.PM.mov
Features
Usage
This first draft contains example usage in a few of the default chat handlers. I would expect the team to modify the messages to their liking.
To display a pending message, you can use the `BaseChatHandler.pending(text: str, ellipsis: bool = True) -> PendingMessage` context manager method to start and remove the pending message. This is used in the default chat handler to provide a 'Thinking...' message:
https://github.com/michaelchia/jupyter-ai/blob/d499edaab60e60192712ab96e4331c96e07deab1/packages/jupyter-ai/jupyter_ai/chat_handlers/default.py#L61-L65
Similarly in `/ask`:
https://github.com/michaelchia/jupyter-ai/blob/d499edaab60e60192712ab96e4331c96e07deab1/packages/jupyter-ai/jupyter_ai/chat_handlers/ask.py#L75-L78
Used in `/learn` to replace the static reply message:
https://github.com/michaelchia/jupyter-ai/blob/d499edaab60e60192712ab96e4331c96e07deab1/packages/jupyter-ai/jupyter_ai/chat_handlers/learn.py#L154-L167
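For illustration, here is a minimal sketch of how the context manager might be used in a custom chat handler. Everything other than `self.pending(...)` and `self.reply(...)` is a placeholder for this example; the handler class, the `llm_chain` attribute, and the message text are assumptions, not part of this PR:

```python
from jupyter_ai.chat_handlers.base import BaseChatHandler
from jupyter_ai.models import HumanChatMessage


class MyChatHandler(BaseChatHandler):
    """Hypothetical handler, shown only to illustrate the pending API."""

    async def process_message(self, message: HumanChatMessage):
        # While this block runs, the frontend shows a "Thinking..." pending
        # message (with an animated ellipsis by default); it is removed when
        # the block exits, even if an exception is raised.
        with self.pending("Thinking..."):
            # Placeholder for the real work, e.g. invoking an LLM chain.
            response = await self.llm_chain.apredict(input=message.body)
        self.reply(response, message)
```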
Instead of the context manager, users can manually start the pending message with `BaseChatHandler.start_pending(text: str, ellipsis: bool = True) -> PendingMessage` and close it by calling `BaseChatHandler.close_pending(message: PendingMessage) -> None`.
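A minimal sketch of this manual pattern, assuming it runs inside a chat handler's `process_message()`; `run_long_pipeline` is a hypothetical placeholder for a multi-step pipeline, not an actual method:

```python
pending = self.start_pending("Processing your request", ellipsis=True)
try:
    # Placeholder for a long-running, multi-step pipeline.
    response = await self.run_long_pipeline(message)
finally:
    # Always remove the pending message, even if the pipeline raises.
    self.close_pending(pending)
self.reply(response, message)
```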
Implementation summary
- `PendingMessage` and `ClosePendingMessage` message types (a rough sketch of these models follows below).
- `pending`, `start_pending` and `close_pending` were added to `BaseChatHandler`.
- `PendingMessages` element added just below `ChatMessages`.
- `PendingMessages` element implemented in `components/pending-messages.tsx` as a modified version of `components/chat-messages.tsx`.
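Roughly, the new message types are Pydantic models along these lines. The fields shown here are indicative only; the authoritative definitions are in `jupyter_ai/models.py` in this PR and may include additional fields (e.g. a persona):

```python
from typing import Literal

from pydantic import BaseModel


class PendingMessage(BaseModel):
    type: Literal["pending"] = "pending"
    id: str
    time: float
    body: str               # text shown to the user, e.g. "Thinking..."
    ellipsis: bool = True   # whether the frontend animates a trailing "..."


class ClosePendingMessage(BaseModel):
    type: Literal["close-pending"] = "close-pending"
    id: str                 # id of the PendingMessage to remove
```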
Next steps
If this idea is accepted by the team, I would need some help cleaning it up so that it adheres to the product standards; for example, `AgentChatMessage` could be refactored so that I could reuse `ChatMessageHeader`.
Let me know what you guys think and if there is anything you would like me to add.