How to answer a prompt using the previous questions' context with LangChain #15159
-
What I want is to upload a document to my fine-tuned LLM using LangChain and ask questions about it. I have used ConversationBufferWindowMemory and ConversationalRetrievalChain, but I was unable to get this working: the model answers correctly only when the question is specific, not when it relies on earlier context. For example, if the file has content about two people (say Barack Obama and Bill Gates):

First question: "Who is Bill Gates?" — the model answers.
Second question: "Where was he born?" — the model answers with Barack Obama's date of birth instead.

What I want is for the second answer to use the context of the previous question. I set k to 10 in ConversationBufferWindowMemory, but it still doesn't work.
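For reference, ConversationBufferWindowMemory retains only the last k exchanges. A toy pure-Python sketch of that windowing behaviour (illustrative only, not LangChain's actual class) shows that with k=10 both questions above should still be inside the window, which suggests the buffered history is not reaching the answering prompt rather than being evicted:

```python
from collections import deque

class WindowMemory:
    """Toy stand-in for a k-turn buffer window (illustrative, not LangChain's class)."""

    def __init__(self, k):
        # deque(maxlen=k) silently drops the oldest pair once k pairs are stored
        self.turns = deque(maxlen=k)

    def save(self, question, answer):
        self.turns.append((question, answer))

    def history(self):
        return "\n".join(f"Human: {q}\nAI: {a}" for q, a in self.turns)

memory = WindowMemory(k=10)
memory.save("Who is Bill Gates?", "Bill Gates co-founded Microsoft.")
memory.save("Where was he born?", "(model answer)")
# With k=10, both turns are still inside the window:
print(memory.history())
```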
Replies: 1 comment 2 replies
-
🤖 Hi @Opperessor,

I'm Dosu, a friendly bot here to help you out while we're waiting for a human maintainer. I'll assist you with bug resolutions, further questions, and even guide you to contribute. Let's crack this issue together!

Based on the issues and solutions I found in the LangChain repository, it seems like you might need to modify the `PROMPT` template to include previous questions and their answers in the context. This can be done by adding a new input variable, say `previous_qa`, to the `input_variables` list and including it in the `prompt_template`. Here's how you can do it:

```python
from langchain.prompts import PromptTemplate  # import needed for PromptTemplate

prompt_template = """Use the following pieces of context and previous questions and answers to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Previous Q&A: {previous_qa}
Question: {question}
Helpful Answer:"""

PROMPT = PromptTemplate(
    template=prompt_template, input_variables=["context", "previous_qa", "question"]
)
```

In this modified template, `previous_qa` holds the earlier questions and answers, so a follow-up like "where was he born?" can be resolved against them.

I hope this helps! If you have any more questions or need further clarification, feel free to ask.
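To make the suggested template concrete, here is a minimal sketch of how it is filled in before being sent to the model, using plain Python string formatting; the context and Q&A strings are invented for illustration:

```python
# The three-variable template suggested above.
prompt_template = """Use the following pieces of context and previous questions and answers to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Previous Q&A: {previous_qa}
Question: {question}
Helpful Answer:"""

# Hypothetical values standing in for the retriever output and the memory contents.
filled = prompt_template.format(
    context="Bill Gates co-founded Microsoft. Barack Obama was born in Honolulu.",
    previous_qa="Q: Who is Bill Gates? A: The co-founder of Microsoft.",
    question="Where was he born?",
)
print(filled)
```

With the previous Q&A injected, the model has what it needs to resolve "he" to Bill Gates. If you build the chain with `ConversationalRetrievalChain.from_llm`, a custom answer prompt is usually supplied via `combine_docs_chain_kwargs={"prompt": PROMPT}`; that parameter name comes from the classic `langchain` package and may differ across versions, so check the docs for the release you are on.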
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.