Throws Error: Missing value for input variable question
on subsequent chatting
#5769
Replies: 2 comments 6 replies
-
Hey there, @ShisuiMadara! I'm here to help you out. If you need assistance with bugs, questions, or want to contribute, I've got your back. Let's tackle this issue together! To resolve the error "Missing value for input variable question", make sure every variable your prompts declare is supplied each time you invoke the chain, including on follow-up turns.
Here’s an updated version of your makeChain function:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { PineconeStore } from "@langchain/pinecone";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { createStuffDocumentsChain } from "langchain/chains/combine_documents";
import { createHistoryAwareRetriever } from "langchain/chains/history_aware_retriever";
import { createRetrievalChain } from "langchain/chains/retrieval";

// createRetrievalChain forwards the user's question under the key "input",
// so every prompt in the pipeline must use {input}, not {question}.
// Using {question} is what causes "Missing value for input variable question"
// once the chain wires the prompts together.
const condenseQuestionTemplate = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question, in its original language. Do not answer the question.`;

const makeChain = async (vectorstore: PineconeStore) => {
  const llm = new ChatOpenAI();

  // The chat history and latest question are injected as messages, so the
  // system prompt itself no longer needs {chat_history}/{question} slots.
  const contextualizeQPrompt = ChatPromptTemplate.fromMessages([
    ["system", condenseQuestionTemplate],
    new MessagesPlaceholder("chat_history"),
    ["human", "{input}"],
  ]);

  const historyAwareRetriever = await createHistoryAwareRetriever({
    llm,
    retriever: vectorstore.asRetriever(),
    rephrasePrompt: contextualizeQPrompt,
  });

  const qaSystemPrompt = `You are an assistant for question-answering tasks. Use the following pieces of retrieved context to answer the question. If you don't know the answer, just say that you don't know.

{context}`;

  const qaPrompt = ChatPromptTemplate.fromMessages([
    ["system", qaSystemPrompt],
    new MessagesPlaceholder("chat_history"),
    ["human", "{input}"],
  ]);

  const docChain = await createStuffDocumentsChain({
    llm,
    prompt: qaPrompt,
  });

  const ragChain = await createRetrievalChain({
    retriever: historyAwareRetriever,
    combineDocsChain: docChain,
  });

  return ragChain;
};

export default makeChain;
```

When invoking the chain, make sure you provide both `input` and `chat_history` on every call, and pass `chat_history` as an array of messages rather than a plain string, since `MessagesPlaceholder` expects a message list:

```typescript
import { HumanMessage, AIMessage } from "@langchain/core/messages";

const chain = await makeChain(vectorstore);
const res = await chain.invoke({
  input: "What is the meaning of life?",
  chat_history: [
    new HumanMessage("Earlier question here"),
    new AIMessage("Earlier answer here"),
  ],
});
console.log(res.answer);
```

This should resolve the error by ensuring that every input variable the prompts declare is actually supplied on each call, including follow-up turns.
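To see why this exact message appears, here is a minimal sketch of the error mechanism in plain TypeScript (this is not LangChain's actual implementation, just an illustration): a template declares a variable, but the caller supplies its value under a different key, so formatting fails.

```typescript
// Hypothetical mini-formatter illustrating how prompt templates report a
// variable that was declared in the template but never supplied.
function formatTemplate(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(/\{(\w+)\}/g, (_, name: string) => {
    if (!(name in values)) {
      // Mirrors the error seen in this thread.
      throw new Error(`Missing value for input variable ${name}`);
    }
    return values[name];
  });
}

// Works: the declared variable is supplied.
formatTemplate("Question: {question}", { question: "What is RAG?" });

// Fails: the value arrives under the key "input", not "question".
// formatTemplate("Question: {question}", { input: "What is RAG?" });
```

This is the same mismatch as in the chain above: the retrieval chain hands the question to the prompt under one key while the template expects another, so the first turn may work (when the key happens to line up) and later turns throw.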
-
Hi @ShisuiMadara, may I know if the issue is resolved?
-
Description
I am trying to make a conversational bot which answers questions based on a given document. For this I created a stuffDocumentsChain and then a retrieval chain. The code runs for the first input from the UI, but for subsequent inputs it throws the error above, even though the question is received from the frontend.
System Info
"@langchain/openai": "^0.1.3",
"@langchain/pinecone": "^0.0.7",
"langchain": "0.2.5",