Replies: 1 comment
-
Hey! Thanks for sharing your question here. I'm sorry for not getting back to you sooner.
-
I want to embed a book, but instead of answering my questions, I want the LLM to ask me questions based on the embedded book. Is this possible? I think I'll need to set a prompt for this somewhere, but there doesn't seem to be any provision to set a custom prompt for a vectorstoreagent. Can anyone guide me on this? Thanks!
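One way to approach this is to skip the vector-store agent entirely and drive the retriever with your own prompt: pull a passage from the embedded book, then instruct the model to generate a question about it rather than answer one. Below is a minimal sketch assuming a recent LangChain release with FAISS and OpenAI; the imports, the `book_chunks` variable, and the model name are assumptions and may differ in your setup.

```python
# Sketch: reverse the usual RAG flow so the model *asks* a question about the
# embedded book instead of answering one. Assumes a LangChain-style setup
# (FAISS + OpenAI); exact imports vary by LangChain version.
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_core.prompts import PromptTemplate

# 1. Embed the book. `book_chunks` is assumed to be a list of Document
#    objects produced by your text splitter.
vectorstore = FAISS.from_documents(book_chunks, OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 3})

# 2. Retrieve a passage to quiz on; the query here just picks a topic/section.
passage = "\n\n".join(d.page_content for d in retriever.invoke("chapter 1"))

# 3. Custom prompt: instead of "answer the question", ask the model to
#    generate a question grounded in the retrieved passage.
quiz_prompt = PromptTemplate.from_template(
    "You are a tutor. Read the following excerpt from a book and ask the "
    "reader ONE comprehension question about it. Do not reveal the answer.\n\n"
    "Excerpt:\n{passage}\n\nQuestion:"
)

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)
question = llm.invoke(quiz_prompt.format(passage=passage))
print(question.content)
```

The key change from the usual vectorstore QA flow is the prompt: the retrieved text goes in as context and the instruction asks for a question instead of an answer, so no agent-level prompt override is needed.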