Hey @jlchereau! Great to see you diving into the depths of LangChain again. Looking forward to unpacking this issue with you 🚀

Based on your description, it seems like the issue lies in the interaction between `create_history_aware_retriever` and `create_retrieval_chain`: the rephrased question is used for retrieval, but the downstream answer chain still receives the original input. To address this, you could modify `create_history_aware_retriever` so that it returns both the retrieved documents and the rephrased question:

```python
from langchain_core.language_models import LanguageModelLike
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import BasePromptTemplate
from langchain_core.retrievers import RetrieverLike, RetrieverOutputLike
from langchain_core.runnables import RunnableBranch, RunnableLambda


def create_history_aware_retriever(
    llm: LanguageModelLike,
    retriever: RetrieverLike,
    prompt: BasePromptTemplate,
) -> RetrieverOutputLike:
    """Modified to return both documents and the rephrased question when there's chat history."""
    if "input" not in prompt.input_variables:
        raise ValueError(
            "Expected `input` to be a prompt variable, "
            f"but got {prompt.input_variables}"
        )

    def retrieve_and_rephrase(input_data: dict) -> dict:
        # Rephrase the question using the LLM and prompt
        rephrased_question = (prompt | llm | StrOutputParser()).invoke(input_data)
        # Retrieve documents based on the rephrased question
        documents = retriever.invoke(rephrased_question)
        # Return both the rephrased question and the documents
        return {"rephrased_question": rephrased_question, "documents": documents}

    retrieve_documents: RetrieverOutputLike = RunnableBranch(
        (
            # Both empty string and empty list evaluate to False
            lambda x: not x.get("chat_history", False),
            # If no chat history, pass the input straight to the retriever
            # and return the original question unchanged
            RunnableLambda(
                lambda x: {
                    "rephrased_question": x["input"],
                    "documents": retriever.invoke(x["input"]),
                }
            ),
        ),
        # If there is chat history, rephrase first, then retrieve
        RunnableLambda(retrieve_and_rephrase),
    ).with_config(run_name="chat_retriever_chain")
    return retrieve_documents
```

This modification ensures that when there is chat history, the question is first rephrased using the provided language model and prompt, and the rephrased question is then used to retrieve documents. Both the rephrased question and the retrieved documents are returned. When there is no chat history, the original question is passed directly to the retriever, and both the original question and the retrieved documents are returned.

Please note that this is a suggested modification and might not perfectly fit your use case. You might need to adjust it according to your specific requirements. Also, remember to thoroughly test any changes to ensure they work as expected.

I hope this helps! If you have any more questions or need further clarification, feel free to ask.
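To illustrate how the downstream answer step would consume this richer output, here is a library-free sketch of the intended data flow. The names `fake_rephrase`, `fake_retriever`, `history_aware_retrieve`, and `answer_chain` are stand-ins invented for this sketch, not LangChain APIs; the key point is that the answer step reads `rephrased_question` rather than the raw `input`.

```python
def fake_rephrase(question: str, chat_history: list) -> str:
    # Stand-in for `prompt | llm | StrOutputParser()`: fold in the last turn.
    return f"{question} (in the context of: {chat_history[-1]})"

def fake_retriever(query: str) -> list:
    # Stand-in for a real retriever: return the query as a single "document".
    return [f"doc about: {query}"]

def history_aware_retrieve(inputs: dict) -> dict:
    # Mirrors the RunnableBranch above: rephrase only when history exists.
    if not inputs.get("chat_history"):
        question = inputs["input"]  # no history: keep the original question
    else:
        question = fake_rephrase(inputs["input"], inputs["chat_history"])
    return {"rephrased_question": question,
            "documents": fake_retriever(question)}

def answer_chain(state: dict) -> str:
    # Key point: answer the *rephrased* question, not the original input.
    return f"Answering '{state['rephrased_question']}' using {state['documents']}"

result = history_aware_retrieve(
    {"input": "Where?", "chat_history": ["When was Napoleon 1st born?"]}
)
print(answer_chain(result))
```

Because `answer_chain` reads `rephrased_question`, the follow-up "Where?" reaches the answering step as a standalone question instead of an unanswerable fragment.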
Description
I am currently evaluating various frameworks for a RAG project with small prototypes. Code available at https://github.com/digillia/Digillia-Colab/blob/main/tools/langchain.ipynb. Click the Google Colab button to run in Colab.
I am using `create_history_aware_retriever`, `create_stuff_documents_chain` and `create_retrieval_chain` as described at https://python.langchain.com/docs/use_cases/question_answering/chat_history/#chain-with-chat-history. The issue I am facing is that when I ask a follow-up question like "Where?", it does not use the history to understand the question and replies that it cannot answer the question.
I have run the code with `set_debug(True)`, which shows a possible design issue in the way `create_history_aware_retriever`, `create_stuff_documents_chain` and `create_retrieval_chain` are supposed to work together.

When I first ask "When was Napoleon 1st born?", there is no history, the question is not rephrased, and the relevant documents are pulled by `history_aware_retriever`. The question is answered by `question_answer_chain` based on these documents added to the context.

Then when I ask "Where?", `history_aware_retriever` properly rephrases the question to "Where was Napoleon 1st born?" considering the chat history and pulls the relevant documents. But then `question_answer_chain` tries to answer the original question "Where?" based on these documents added to the context, and fails.

I have not found a way to fix this with the LangChain APIs, so this might be a design issue. Please let me know your opinion.
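The mismatch I am describing can be reproduced in a minimal library-free trace (the stubs `rephrase`, `retrieve` and `build_answer_prompt` are invented for this sketch, not LangChain functions): retrieval sees the rephrased query, but the answer prompt is still built from the raw `input` field, so the model never sees the standalone question.

```python
def rephrase(question: str, chat_history: list) -> str:
    # Stub for the history-aware rephrasing step.
    return "Where was Napoleon 1st born?" if chat_history else question

def retrieve(query: str) -> list:
    # Stub retriever: pretend we found one relevant document.
    return ["Napoleon 1st was born in Ajaccio, Corsica."]

def build_answer_prompt(inputs: dict, documents: list) -> str:
    # Mirrors the behavior in question: the prompt uses inputs["input"],
    # the original question, not the rephrased one.
    return f"Context: {documents}\nQuestion: {inputs['input']}"

inputs = {"input": "Where?", "chat_history": ["When was Napoleon 1st born?"]}
standalone = rephrase(inputs["input"], inputs["chat_history"])
docs = retrieve(standalone)                  # retrieval uses the rephrased question...
prompt = build_answer_prompt(inputs, docs)   # ...but the answer prompt does not
print(prompt)
```

The final prompt asks "Where?" with no surrounding history, which is why the model cannot answer even though the right documents were retrieved.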