feat(anthropic): langchain-anthropic support custom create anthropic client #6615
Conversation
Ah very nice!
Thank you for this!
This is live in …
Hey, I'm a little nooby.
Hey @rossanodr, do these docs help? https://js.langchain.com/v0.2/docs/integrations/chat/anthropic/#custom-clients
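For anyone landing here later, here is a minimal sketch of the custom-client pattern those docs describe, assuming the stock @anthropic-ai/sdk client; the proxy baseURL is a placeholder, not a real endpoint:

```ts
import Anthropic from '@anthropic-ai/sdk';
import { ChatAnthropic } from '@langchain/anthropic';

// A pre-configured client pointed at a placeholder proxy URL.
const customClient = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: 'https://my-anthropic-proxy.example.com',
});

const model = new ChatAnthropic({
  model: 'claude-3-5-sonnet-20240620',
  // ChatAnthropic still checks for an API key, so keep it set here or in
  // the ANTHROPIC_API_KEY environment variable.
  apiKey: process.env.ANTHROPIC_API_KEY,
  // Tell ChatAnthropic to use the pre-built client instead of
  // constructing its own.
  createClient: () => customClient,
});

const response = await model.invoke([['human', 'Hello!']]);
console.log(response.content);
```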
Thanks, but I can't seem to find a way to make it work with Vertex. I'm still getting the error:
The model name should be similar to the one in the snippet below (`claude-3-5-sonnet@20240620`). Additionally, you should first create an AnthropicVertex client and test it without langchain, to determine whether the error is caused by langchain. For your reference, I also encountered many authentication-related errors when I initially used AnthropicVertex.

```ts
import { AnthropicVertex } from '@anthropic-ai/vertex-sdk';
import { GoogleAuth } from 'google-auth-library';
import { ChatAnthropic } from '@langchain/anthropic';

// Build the Vertex client with explicit service-account credentials.
const client = new AnthropicVertex({
  region: import.meta.env.VITE_VERTEX_ANTROPIC_REGION,
  projectId: import.meta.env.VITE_VERTEX_ANTROPIC_PROJECTID,
  googleAuth: new GoogleAuth({
    credentials: {
      client_email: import.meta.env
        .VITE_VERTEX_ANTROPIC_GOOGLE_SA_CLIENT_EMAIL!,
      private_key: import.meta.env
        .VITE_VERTEX_ANTROPIC_GOOGLE_SA_PRIVATE_KEY!,
    },
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  }),
});

// Hand the pre-built client to ChatAnthropic via createClient.
const chat = new ChatAnthropic({
  apiKey: 'test',
  model: 'claude-3-5-sonnet@20240620',
  createClient: (() => client) as any,
});

const response = await chat.invoke([['human', 'Hello!']]);
console.log(response);
```

Strange, I just noticed that I've been passing a fake apiKey to ChatAnthropic all along. I will submit another PR to skip the API-key existence check when createClient is provided.
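Following the suggestion above about testing AnthropicVertex outside of langchain first, here is a minimal sketch that reuses the `client` built in the snippet above and calls the Vertex messages API directly; the model name and max_tokens value are just illustrative:

```ts
// Exercise the AnthropicVertex client on its own first, so any
// authentication problem surfaces outside of langchain.
const direct = await client.messages.create({
  model: 'claude-3-5-sonnet@20240620',
  max_tokens: 256,
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(direct.content);
```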
I am having authentication issues: [llm/error] [1:llm …
Similarly, for testing I suggest using Node.js only, rather than the Edge runtime, as the two have some annoying inconsistencies. Get the basic examples working in Node.js first and only then move to the Edge runtime; otherwise you may run into issues without knowing where the error is coming from (in fact, Google Cloud's authentication is itself quite complex, and compared with the original LLM provider, which only needs an API key, it's downright nasty). Additionally, I've submitted a PR for @anthropic-ai/vertex-sdk to support the edge runtime (in my scenario, Cloudflare Workers), but it hasn't been merged yet. Reference: anthropics/anthropic-sdk-typescript#509
Unfortunately, I can't opt for the Node.js environment at the moment. I hope it will be possible to use this in the edge environment soon.
feat(anthropic): langchain-anthropic support custom create anthropic client (langchain-ai#6615)
Co-authored-by: jacoblee93 <jacoblee93@gmail.com>
Adds support for custom Anthropic clients in @langchain/anthropic, which enables the use of @anthropic-ai/vertex-sdk and @anthropic-ai/bedrock-sdk, as well as other custom Anthropic clients, such as one modified to run in an edge runtime.
Fixes #5196
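The Bedrock path mentioned in the description follows the same createClient pattern. A minimal sketch, assuming @anthropic-ai/bedrock-sdk's AnthropicBedrock client; the region, placeholder API key, and Bedrock model id here are illustrative:

```ts
import { AnthropicBedrock } from '@anthropic-ai/bedrock-sdk';
import { ChatAnthropic } from '@langchain/anthropic';

// Region is illustrative; credentials can come from the standard AWS
// environment variables or the awsAccessKey/awsSecretKey options.
const bedrockClient = new AnthropicBedrock({
  awsRegion: 'us-east-1',
});

const model = new ChatAnthropic({
  // Bedrock uses its own model identifiers.
  model: 'anthropic.claude-3-5-sonnet-20240620-v1:0',
  // A placeholder key is still required until the API-key check is
  // relaxed (see the follow-up PR mentioned in the conversation).
  apiKey: 'placeholder',
  createClient: (() => bedrockClient) as any,
});

console.log(await model.invoke([['human', 'Hello!']]));
```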