
Add a "continuous" chat mode for Talk bot commands to use with chatbots #9564

Closed

e-caste opened this issue May 18, 2023 · 1 comment


e-caste commented May 18, 2023

How to use GitHub

  • Please use the 👍 reaction to show that you are interested in the same feature.
  • Please don't comment if you have no relevant information to add. It's just extra noise for everyone subscribed to this issue.
  • Subscribe to receive notifications on status change and new comments.

Is your feature request related to a problem? Please describe.
I am currently experimenting with self-hosting a quantized version of the OpenAssistant LLM behind the API provided by https://github.com/oobabooga/text-generation-webui. To achieve fully self-hosted chat functionality, I have developed a simple Python script that sends an API request whenever a Nextcloud Talk command is invoked in the chat, passing the rest of the message to the LLM as a prompt. This can of course be extended to other APIs, such as those offered by Hugging Face and OpenAI.
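For reference, here is a minimal sketch of the kind of command handler described above. It assumes Talk passes the rest of the message as the command's arguments and posts the script's output back to the chat; the endpoint path, payload shape, and response format follow an older text-generation-webui API and may need adjusting for a specific deployment:

```python
#!/usr/bin/env python3
"""Hypothetical Talk command handler: forwards the message text to a
self-hosted LLM API and prints the reply, which Talk then posts back
to the chat. Endpoint and payload are assumptions based on an older
text-generation-webui API."""
import json
import sys
import urllib.request

# Assumed local text-generation-webui instance; adjust host/port/path as needed.
API_URL = "http://localhost:5000/api/v1/generate"


def ask_llm(prompt: str) -> str:
    payload = json.dumps({"prompt": prompt, "max_new_tokens": 400}).encode()
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        data = json.load(resp)
    # Older webui builds returned {"results": [{"text": "..."}]}.
    return data["results"][0]["text"]


if __name__ == "__main__":
    # The rest of the chat message arrives as the command's arguments.
    prompt = " ".join(sys.argv[1:]).strip()
    print(ask_llm(prompt) if prompt else "Usage: /<command> <prompt>")
```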

Describe the solution you'd like
I'd like to be able to avoid typing the command every time I want to send a message to the LLM. My quick proposal:

  • /<command> enable: enters a mode in which everything typed in the chat from then on is processed by the command <command>
  • /<command> disable: exits this "chat mode" and stops sending message contents to the command script (a rough sketch of how such a toggle could be tracked follows below).
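As a rough illustration only (no such mode exists in Talk today), the toggle could be persisted per conversation by the command script itself; the state-file path and the `room` identifier below are hypothetical placeholders:

```python
import json
from pathlib import Path

# Hypothetical location for the per-conversation chat-mode state.
STATE_FILE = Path("/var/lib/talk-llm-bot/chat_mode.json")


def _load_state() -> dict:
    return json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}


def set_chat_mode(room: str, enabled: bool) -> None:
    """Record whether "chat mode" is enabled for a given conversation."""
    state = _load_state()
    state[room] = enabled
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps(state))


def chat_mode_enabled(room: str) -> bool:
    """When True, every message in the room is forwarded to the LLM
    without requiring the /<command> prefix."""
    return _load_state().get(room, False)
```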

Describe alternatives you've considered
Right now I have to type the command in every message. It works, but it is not optimal and would not work for inexperienced users.

Additional context
I'd also love it if Talk supported Markdown; it would be very useful when asking LLMs to produce code or formatted text (not opening another issue as #1027 already exists) 🤖

nickvergessen (Member) commented May 18, 2023

#9458 is what will cover this
So it's a duplicate of #1879

nickvergessen closed this as not planned on May 18, 2023