-
Notifications
You must be signed in to change notification settings - Fork 15.9k
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Issue: <Please write a comprehensive title after the 'Issue: ' prefix> #13349
Comments
🤖 Hello @Saulti23! I'm here to assist you with your issue. I'm a bot designed to help solve bugs and answer your questions about the LangChain repository. While we're waiting for a human maintainer, I'll be working on your issue. I'll provide you with a complete response shortly. Let's get started! |
🤖 Based on the error message you provided, it seems that the Here is the relevant part of your code: response = conversation({"question": question}) To fix this issue, you should include the response = conversation({"question": question, "session_id": str(session_id)}) This will ensure that the Please note that the I hope this helps! If you have any other questions or if the issue persists, please let me know. SourcesThis response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot. |
I already tried that and it didn't work I got this error: ValueError: One input key expected got ['session_id', 'question'] |
🤖 Based on the information you've provided and the context from the LangChain repository, it seems like the In your case, you're passing a dictionary with keys To resolve this issue, you should modify your code to pass the history = "Previous conversation: " + str(session_id)
input = question
response = conversation(history, input) In this example, As for the self.metadata["session_id"] = str(session_id) You can add this line of code in the I hope this helps! Let me know if you have any other questions. Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot. |
now Im getting this error which is contradicting you're previous solution and I feel like the error is looping |
🤖 Based on the issues I found in the LangChain repository, it seems like the In the DOC: Passing Context to Conversational Chain issue, it's suggested that you can pass context to the from langchain.chains import (
StuffDocumentsChain, LLMChain, ConversationalRetrievalChain
)
from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
combine_docs_chain = StuffDocumentsChain(...)
vectorstore = ...
retriever = vectorstore.as_retriever()
template = (
"Combine the chat history and follow up question into "
"a standalone question. Chat History: {chat_history}"
"Follow up question: {question}"
)
prompt = PromptTemplate.from_template(template)
llm = OpenAI()
question_generator_chain = LLMChain(llm=llm, prompt=prompt)
chain = ConversationalRetrievalChain(
combine_docs_chain=combine_docs_chain,
retriever=retriever,
question_generator=question_generator_chain,
) In this example, the In your case, you might need to adjust the keys you're passing to the As for the expected format and type of the I hope this helps! If you have any further questions or if something is unclear, please let me know. Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot. |
Hi, @Saulti23, I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you encountered a ValueError for missing input keys, specifically 'session_id', when using the code. Dosubot provided some suggestions, and it seems the issue was eventually resolved by adjusting the keys passed to the 'conversation' function to match the expected input keys of the 'LLMChain' instance. Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you! |
Issue you'd like to raise.
I have my code and for the first section of it I am inputting a memory key based in the session_id as a UUID
I am receiving the following error: Any idea what the issue is?
here is the input:
session_id = uuid.uuid4() print(chat_csv(session_id,None,'Hi how are you?'))
Here is the code:
Suggestion:
No response
The text was updated successfully, but these errors were encountered: