BadRequestError: 400 An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_hSmZB4G8cu3xYSWU6swBuOMo #6621
Comments
To resolve the "BadRequestError: 400 An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'" error in your LangChain code, ensure that every tool call made by the assistant is followed by a corresponding tool message that responds to that call's `tool_call_id`.
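For example, a history that satisfies this requirement can be assembled like so (a minimal sketch; the `add` tool and the prompt are illustrative, and the key point is that each `ToolMessage` carries the `tool_call_id` of the call it answers):

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, BaseMessage } from "@langchain/core/messages";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Illustrative tool; any tool bound to the model works the same way.
const add = tool(async ({ a, b }) => String(a + b), {
  name: "add",
  description: "Add two numbers",
  schema: z.object({ a: z.number(), b: z.number() }),
});

const model = new ChatOpenAI({ model: "gpt-4o", temperature: 0 }).bindTools([add]);

const history: BaseMessage[] = [new HumanMessage("What is 2 + 2?")];

// First call: the model responds with tool_calls instead of a final answer.
const aiMessage = await model.invoke(history);
history.push(aiMessage);

// Answer every tool call with a ToolMessage whose tool_call_id matches it
// before sending the history back to the model.
for (const toolCall of aiMessage.tool_calls ?? []) {
  const toolMessage = await add.invoke(toolCall);
  history.push(toolMessage);
}

// This call succeeds because each tool_call_id now has a response message.
const finalMessage = await model.invoke(history);
console.log(finalMessage.content);
```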
Here is a relevant snippet from the code to help you understand how to handle tool calls:

```typescript
const responseMessageWithFunctions = new AIMessage({
content: chatGenerationContent.split("<function_calls>")[0],
additional_kwargs: {
tool_calls: invocations.map((toolInvocation, i) => {
const calledTool = tools.find(
(tool) => tool.function.name === toolInvocation.tool_name
);
if (calledTool === undefined) {
throw new Error(
`Called tool "${toolInvocation.tool_name}" did not match an existing tool.`
);
}
return {
id: i.toString(),
type: "function",
function: {
name: toolInvocation.tool_name,
arguments: JSON.stringify(
fixArrayXMLParameters(
calledTool.function.parameters as JsonSchema7ObjectType,
toolInvocation.parameters
)
),
},
};
}),
},
});
```

Additionally, the `convertToConverseMessages` function below shows how each message type is converted; in particular, a `ToolMessage` becomes a `toolResult` block whose `toolUseId` answers the assistant's `toolUse`:

```typescript
export function convertToConverseMessages(messages: BaseMessage[]): {
converseMessages: BedrockMessage[];
converseSystem: BedrockSystemContentBlock[];
} {
const converseSystem: BedrockSystemContentBlock[] = messages
.filter((msg) => msg._getType() === "system")
.map((msg) => {
if (typeof msg.content === "string") {
return { text: msg.content };
} else if (msg.content.length === 1 && msg.content[0].type === "text") {
return { text: msg.content[0].text };
}
throw new Error(
"System message content must be either a string, or a content array containing a single text object."
);
});
const converseMessages: BedrockMessage[] = messages
.filter((msg) => msg._getType() !== "system")
.map((msg) => {
if (msg._getType() === "ai") {
const castMsg = msg as AIMessage;
const assistantMsg: BedrockMessage = {
role: "assistant",
content: [],
};
if (castMsg.tool_calls && castMsg.tool_calls.length) {
assistantMsg.content = castMsg.tool_calls.map((tc) => ({
toolUse: {
toolUseId: tc.id,
name: tc.name,
input: tc.args,
},
}));
}
if (typeof castMsg.content === "string" && castMsg.content !== "") {
assistantMsg.content?.push({
text: castMsg.content,
});
} else if (Array.isArray(castMsg.content)) {
const contentBlocks: ContentBlock[] = castMsg.content.map((block) => {
if (block.type === "text" && block.text !== "") {
return {
text: block.text,
};
} else {
const blockValues = Object.fromEntries(
Object.entries(block).filter(([key]) => key !== "type")
);
throw new Error(
`Unsupported content block type: ${
block.type
} with content of ${JSON.stringify(blockValues, null, 2)}`
);
}
});
assistantMsg.content = [
...(assistantMsg.content ? assistantMsg.content : []),
...contentBlocks,
];
}
return assistantMsg;
} else if (msg._getType() === "human" || msg._getType() === "generic") {
if (typeof msg.content === "string" && msg.content !== "") {
return {
role: "user" as const,
content: [
{
text: msg.content,
},
],
};
} else if (Array.isArray(msg.content)) {
const contentBlocks: ContentBlock[] = msg.content.flatMap((block) => {
if (block.type === "image_url") {
const base64: string =
typeof block.image_url === "string"
? block.image_url
: block.image_url.url;
return extractImageInfo(base64);
} else if (block.type === "text") {
return {
text: block.text,
};
} else {
throw new Error(`Unsupported content block type: ${block.type}`);
}
});
return {
role: "user" as const,
content: contentBlocks,
};
} else {
throw new Error(
`Invalid message content: empty string. '${msg._getType()}' must contain non-empty content.`
);
}
} else if (msg._getType() === "tool") {
const castMsg = msg as ToolMessage;
if (typeof castMsg.content === "string") {
return {
// Tool use messages are always from the user
role: "user" as const,
content: [
{
toolResult: {
toolUseId: castMsg.tool_call_id,
content: [
{
text: castMsg.content,
},
],
},
},
],
};
} else {
return {
// Tool use messages are always from the user
role: "user" as const,
content: [
{
toolResult: {
toolUseId: castMsg.tool_call_id,
content: [
{
json: castMsg.content,
},
],
},
},
],
};
}
} else {
throw new Error(`Unsupported message type: ${msg._getType()}`);
}
});
// Combine consecutive user tool result messages into a single message
const combinedConverseMessages = converseMessages.reduce<BedrockMessage[]>(
(acc, curr) => {
const lastMessage = acc[acc.length - 1];
if (
lastMessage &&
lastMessage.role === "user" &&
lastMessage.content?.some((c) => "toolResult" in c) &&
curr.role === "user" &&
curr.content?.some((c) => "toolResult" in c)
) {
lastMessage.content = lastMessage.content.concat(curr.content);
} else {
acc.push(curr);
}
return acc;
},
[]
);
return { converseMessages: combinedConverseMessages, converseSystem };
}
```

Ensure that your tool calls and responses are correctly formatted and included in the messages to avoid the error [1][2][3][4][5].
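One way to catch this locally before the request is sent is to verify that every `tool_call_id` emitted by an assistant message has a matching tool message later in the array (a minimal sketch; `assertToolCallsAnswered` is a hypothetical helper, not part of LangChain):

```typescript
import { AIMessage, BaseMessage, ToolMessage } from "@langchain/core/messages";

// Hypothetical check: throw if an AI message with tool_calls is not followed
// by a ToolMessage responding to each of its tool_call_ids.
function assertToolCallsAnswered(messages: BaseMessage[]): void {
  messages.forEach((msg, i) => {
    if (msg._getType() !== "ai") return;
    const toolCalls = (msg as AIMessage).tool_calls ?? [];
    if (toolCalls.length === 0) return;
    const answeredIds = new Set(
      messages
        .slice(i + 1)
        .filter((m): m is ToolMessage => m._getType() === "tool")
        .map((m) => m.tool_call_id)
    );
    const missing = toolCalls
      .map((tc) => tc.id)
      .filter((id): id is string => id !== undefined && !answeredIds.has(id));
    if (missing.length > 0) {
      throw new Error(`Missing tool responses for: ${missing.join(", ")}`);
    }
  });
}
```

Running this over the message array right before invoking the model surfaces the missing `tool_call_id` in your own code instead of as a 400 from the API.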
I have a similar issue using tool calling, RunnableWithMessageHistory, and a prompt template. I'm not using an agent executor however. I tried passing tool messages back to invoke but no dice.

```typescript
const userNameTool = tool(
async (data) => data,
{
name: "userName",
description: "called when the user names themselves",
schema: z.object({
name: z.string(),
message: z.string().describe("a message you will say to the user after learning their name"),
})
}
);
const llm = new ChatOpenAI({
model: "gpt-4o",
temperature: 0,
});
const prompt = ChatPromptTemplate.fromMessages([
['system', systemTemplate],
['placeholder', '{chat_history}'],
['human', "{input}"],
]);
const llmWithTools = llm.bindTools(tools);
const runnable = prompt.pipe(llmWithTools);
export const runnableWithMessageHistory = new RunnableWithMessageHistory({
runnable,
getMessageHistory: async (sessionId) => new RedisChatMessageHistory({
sessionId,
config: {
url: process.env.REDIS_URL,
},
}),
inputMessagesKey: "input",
historyMessagesKey: "chat_history",
});
const message = {
agent_name: "Bob",
caller_name: "Frankie",
caller_preferences: ['likes milk', 'eats candy'],
input: "my name is josh, actually",
};
const options = {
configurable: {
sessionId: 'hanky_panky',
},
};
const stream = await runnableWithMessageHistory.stream(message, options);
let gathered = undefined;
const toolMessages = [];
for await (const chunk of stream) {
gathered = gathered !== undefined ? concat(gathered, chunk) : chunk;
if (chunk.content) {
// send
}
if (chunk.response_metadata.finish_reason === "tool_calls") {
const toolCalls = gathered.tool_calls;
for (const toolCall of toolCalls) {
const tool = toolsByName[toolCall.name];
const toolMessage = await tool.invoke(toolCall);
// getting this error: "An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'."
toolMessages.push(toolMessage);
}
// await runnableWithMessageHistory.invoke(toolMessages, options);
// -> Error: Missing value for input variable `caller_name`
}
}
```
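A workaround that follows from this (a sketch, not a confirmed fix from this thread): RunnableWithMessageHistory records the input and the AI output for each turn, but the ToolMessages produced in the loop above never reach the Redis history, so the stored assistant message is left with unanswered tool_calls. Writing them into the same per-session history before the next turn closes that gap; the `persistToolMessages` helper below is illustrative.

```typescript
import { BaseChatMessageHistory } from "@langchain/core/chat_history";
import { ToolMessage } from "@langchain/core/messages";

// Persist each tool result into the per-session history so the stored
// assistant message with tool_calls is followed by matching ToolMessages.
// `history` would be the same RedisChatMessageHistory instance that the
// getMessageHistory callback above returns for this sessionId.
async function persistToolMessages(
  history: BaseChatMessageHistory,
  toolMessages: ToolMessage[]
): Promise<void> {
  for (const toolMessage of toolMessages) {
    await history.addMessage(toolMessage);
  }
}
```

With the tool results stored next to the assistant message, the next call to `runnableWithMessageHistory` for the same sessionId loads a history in which every `tool_call_id` has a response, which is exactly what the 400 error is asking for.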
Hi, @arkodeep3404. I'm Dosu, and I'm helping the LangChain JS team manage their backlog. I'm marking this issue as stale. Issue Summary:
Next Steps:
Thank you for your understanding and contribution!
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
```
arkodeepchatterjee@Arkodeeps-MacBook-Air chat % node index.js
New LangChain packages are available that more efficiently handle tool calling.
Please upgrade your packages to versions that set message tool calls. e.g.,
yarn add @langchain/anthropic
, yarn add @langchain/openai`, etc.
node:internal/process/esm_loader:40
internalBinding('errors').triggerUncaughtException(
^
BadRequestError: 400 An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_hSmZB4G8cu3xYSWU6swBuOMo
at APIError.generate (file:///Users/arkodeepchatterjee/Desktop/chat/node_modules/openai/error.mjs:41:20)
at OpenAI.makeStatusError (file:///Users/arkodeepchatterjee/Desktop/chat/node_modules/openai/core.mjs:268:25)
at OpenAI.makeRequest (file:///Users/arkodeepchatterjee/Desktop/chat/node_modules/openai/core.mjs:311:30)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async file:///Users/arkodeepchatterjee/Desktop/chat/node_modules/@langchain/openai/dist/chat_models.js:1302:29
at async RetryOperation._fn (/Users/arkodeepchatterjee/Desktop/chat/node_modules/p-retry/index.js:50:12) {
status: 400,
headers: {
'access-control-expose-headers': 'X-Request-ID',
'alt-svc': 'h3=":443"; ma=86400',
'cf-cache-status': 'DYNAMIC',
'cf-ray': '8b7e5bc1e97f2961-BOM',
connection: 'keep-alive',
'content-length': '325',
'content-type': 'application/json',
date: 'Fri, 23 Aug 2024 21:57:28 GMT',
'openai-organization': 'raheel-ioccbf',
'openai-processing-ms': '21',
'openai-version': '2020-10-01',
server: 'cloudflare',
'set-cookie': '__cf_bm=xVttIsHFgX4RJcqQIP17W4kjroBwaRb2sp_eZRnTnTU-1724450248-1.0.1.1-SErkhyNCqzUpJ1D1vMzrnjhjzQibfeclp03kei7Vcoy4KakiQ7U5ezHxJxp8vU54HqKYMZT6JxJIk__XqpAAbA; path=/; expires=Fri, 23-Aug-24 22:27:28 GMT; domain=.api.openai.com; HttpOnly; Secure; SameSite=None, _cfuvid=wcSZXMnokbW7KcXy2h26OdsDxUPE1H6OGLz4zeYNCa8-1724450248524-0.0.1.1-604800000; path=/; domain=.api.openai.com; HttpOnly; Secure; SameSite=None',
'strict-transport-security': 'max-age=15552000; includeSubDomains; preload',
'x-content-type-options': 'nosniff',
'x-ratelimit-limit-requests': '5000',
'x-ratelimit-limit-tokens': '2000000',
'x-ratelimit-remaining-requests': '4999',
'x-ratelimit-remaining-tokens': '1999765',
'x-ratelimit-reset-requests': '12ms',
'x-ratelimit-reset-tokens': '7ms',
'x-request-id': 'req_fdd5c472dfef8a7839ec5f00aaf4d4bd'
},
request_id: 'req_fdd5c472dfef8a7839ec5f00aaf4d4bd',
error: {
message: "An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_hSmZB4G8cu3xYSWU6swBuOMo",
type: 'invalid_request_error',
param: 'messages.[19].role',
code: null
},
code: null,
param: 'messages.[19].role',
type: 'invalid_request_error',
attemptNumber: 1,
retriesLeft: 6
}
Node.js v20.9.0
arkodeepchatterjee@Arkodeeps-MacBook-Air chat %
```
Description
message history + tool calling doesn't work because the ToolMessage object doesn't exist in the history stack
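Concretely, the request that triggers the 400 contains an assistant entry with `tool_calls` but no `role: "tool"` entry answering it. A sketch of the difference in the OpenAI wire format, reusing the `tool_call_id` from the trace above with illustrative arguments and results:

```typescript
// Rejected: the assistant's tool call has no "tool" message responding to it.
const badMessages = [
  { role: "user", content: "my name is josh, actually" },
  {
    role: "assistant",
    content: null,
    tool_calls: [
      {
        id: "call_hSmZB4G8cu3xYSWU6swBuOMo",
        type: "function",
        function: {
          name: "userName",
          arguments: '{"name":"josh","message":"Nice to meet you, Josh!"}',
        },
      },
    ],
  },
  { role: "user", content: "next question" }, // 400: unanswered tool_call_id
];

// Accepted: every tool_call_id gets a "tool" message before the chat continues.
const goodMessages = [
  badMessages[0],
  badMessages[1],
  {
    role: "tool",
    tool_call_id: "call_hSmZB4G8cu3xYSWU6swBuOMo",
    content: '{"name":"josh","message":"Nice to meet you, Josh!"}',
  },
  { role: "user", content: "next question" },
];
```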
System Info
```json
{
"name": "chat",
"version": "1.0.0",
"main": "index.js",
"type": "module",
"scripts": {
"test": "echo "Error: no test specified" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"description": "",
"dependencies": {
"@aws-sdk/client-dynamodb": "^3.637.0",
"@langchain/community": "^0.2.31",
"@langchain/core": "^0.2.28",
"@langchain/openai": "^0.2.7",
"dotenv": "^16.4.5",
"langchain": "^0.2.16",
"zod": "^3.23.8"
}
}
```