
Huggingface tool-calling is not working #24430

Closed
hadifar opened this issue Jul 19, 2024 · 1 comment · Fixed by #24456
hadifar commented Jul 19, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline
from langchain_core.pydantic_v1 import BaseModel, Field


class Calculator(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


tools = [Calculator]

llm = HuggingFacePipeline.from_model_id(
    model_id="microsoft/Phi-3-mini-4k-instruct",
    task="text-generation",
    device_map="auto",
    pipeline_kwargs={
        "max_new_tokens": 1024,
        "do_sample": False,
        "repetition_penalty": 1.03,
    }
)
chat_model = ChatHuggingFace(llm=llm)

print(chat_model.invoke("How much is 3 multiplied by 12?"))

Error Message and Stack Trace (if applicable)

Here is the output:
content='<|user|>\nHow much is 3 multiplied by 12?<|end|>\n<|assistant|>\n To find the product of 3 and 12, you simply multiply the two numbers together:\n\n3 × 12 = 36\n\nSo, 3 multiplied by 12 equals 36.' id='run-9270dbaa-9edd-4ca4-bb33-3dec0de34957-0'

Description

Hello. According to the documentation, ChatHuggingFace supports tool calling. However, when I run the example from the documentation, it returns plain LLM text output rather than a tool call.
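
For reference, the documented tool-calling flow binds the tool schema to the chat model before invoking it. A minimal sketch of what I would expect, reusing the chat_model and tools defined above (the chat_with_tools name is mine, not from the docs):

# Bind the Calculator schema so the model can emit structured tool calls
chat_with_tools = chat_model.bind_tools(tools)

msg = chat_with_tools.invoke("How much is 3 multiplied by 12?")
# With a tool-calling backend I would expect something like:
# msg.tool_calls == [{"name": "Calculator", "args": {"a": 3, "b": 12}, "id": "..."}]
print(msg.tool_calls)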

System Info

langchain==0.2.9
langchain-community==0.2.7
langchain-core==0.2.21
langchain-huggingface==0.0.3
langchain-text-splitters==0.2.2

Ubuntu 22.04.3 LTS

Python 3.10.12

langcarl bot added the investigate label Jul 19, 2024
dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jul 19, 2024
efriis commented Jul 19, 2024

Howdy! This section of the docs clarifies that only text-generation-inference backends support tool calling, which is why it isn't working with HuggingFacePipeline!

I'll add a note to the page you linked clarifying that even though a class supports tool calling, not all models/parameters necessarily work with it.
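
For anyone landing here, a rough sketch of the setup the docs example assumes: a text-generation-inference-served model behind HuggingFaceEndpoint instead of a local HuggingFacePipeline. The model id and parameters below are illustrative, not from this issue, and a Hugging Face API token with access to the model is required:

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# TGI-backed hosted endpoint rather than a local pipeline
llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-70B-Instruct",  # illustrative model id
    task="text-generation",
    max_new_tokens=512,
    do_sample=False,
)
chat_model = ChatHuggingFace(llm=llm)

# Binding the tool schema lets the model return structured tool calls
chat_with_tools = chat_model.bind_tools([Calculator])
msg = chat_with_tools.invoke("How much is 3 multiplied by 12?")
print(msg.tool_calls)  # expected: a Calculator call with a=3, b=12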

efriis self-assigned this Jul 19, 2024
efriis closed this as completed in 50cb0a0 Jul 19, 2024
olgamurraft pushed a commit to olgamurraft/langchain that referenced this issue Aug 16, 2024