ValueError when instantiating LLMGraphTransformer object using AzureChatOpenAI related to Native Function Calling Support #24457
-
Example Code

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_openai import AzureChatOpenAI
# Set up chat language model
chat_model_azure = AzureChatOpenAI(
azure_endpoint=AZURE_LLM_ENDPOINT,
deployment_name="gpt-4o",
openai_api_version=AZURE_API_VERSION,
openai_api_key=AZURE_LLM_API_KEY,
temperature=0,
)
llm_transformer = LLMGraphTransformer(
llm=chat_model_azure,
node_properties=["description"],
relationship_properties=["description"]
)

Description

When attempting to instantiate an LLMGraphTransformer object using an AzureChatOpenAI object, I am met with the following error:

According to the OpenAI documentation, this version of GPT-4o does support function calling. Could this at all be related to Microsoft deprecating the …? If so, is there any workaround at all?

System Info

System Information
Package Information
Packages not installed (Not Necessarily a Problem)

The following packages were not found:
Replies: 3 comments
-
Hey there, @vincentsyha! I'm here to help you out with any bugs, questions, and even becoming a contributor. Let's see what we can do about that issue you're facing.

To work around this error, you can instantiate the LLMGraphTransformer with node_properties and relationship_properties set to False. Here is an example:

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_openai import AzureChatOpenAI
# Set up chat language model
chat_model_azure = AzureChatOpenAI(
azure_endpoint=AZURE_LLM_ENDPOINT,
deployment_name="gpt-4o",
openai_api_version=AZURE_API_VERSION,
openai_api_key=AZURE_LLM_API_KEY,
temperature=0,
)
# Instantiate the LLMGraphTransformer object with node_properties and relationship_properties set to False
llm_transformer = LLMGraphTransformer(
llm=chat_model_azure,
allowed_nodes=["Person", "Company", "Product", "Award", "Characteristic"],
allowed_relationships=["WORKS_FOR", "HAS_AWARD", "PRODUCED_BY", "HAS_CHARACTERISTIC"],
node_properties=False,
relationship_properties=False
)

By setting node_properties and relationship_properties to False, the transformer no longer asks the model to extract properties, so it does not require native function calling support.
-
Same problem. I'm trying to use the LLMGraphTransformer with "llama3-groq-tool-use" via Ollama. I tried node_properties=True and node_properties=['name', 'document_id'] but got the same error.

EDIT: Never mind... I just found out that the "langchain_community.llms.Ollama" class does not implement the "with_structured_output()" method needed for function calling. =(
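You can check for this up front instead of hitting the ValueError. As a minimal illustration in plain Python (no langchain required; the two classes below are hypothetical stand-ins, not real langchain classes), LLMGraphTransformer needs the model to expose a with_structured_output() method, so a simple attribute check tells you in advance whether a given model class will work:

```python
class LegacyLLM:
    """Stand-in for a completion-style LLM class (e.g. the legacy Ollama wrapper)."""
    def invoke(self, prompt):
        ...

class ChatModel:
    """Stand-in for a chat model that supports structured output."""
    def invoke(self, prompt):
        ...
    def with_structured_output(self, schema):
        ...

def supports_structured_output(model) -> bool:
    # LLMGraphTransformer's property extraction relies on this method existing.
    return callable(getattr(model, "with_structured_output", None))

print(supports_structured_output(LegacyLLM()))   # False
print(supports_structured_output(ChatModel()))   # True
```

Running this check on your model object before building the transformer makes the failure mode obvious: the legacy LLM wrapper lacks the method, while chat-model classes provide it.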
-
In my case the problem was that the version of langchain-openai in my environment was too low. Upgrading to 0.1.17 resolved this issue for me. |
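If you want to verify the installed version programmatically before upgrading, here is a small sketch (the package name and minimum version come from this thread; the comparison only handles simple x.y.z version strings):

```python
from importlib.metadata import version, PackageNotFoundError

def check_min_version(pkg: str, minimum: tuple) -> str:
    """Report whether pkg meets a minimum version (simple x.y.z versions only)."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        return f"{pkg} is not installed"
    # Compare the leading numeric components as integer tuples;
    # pre-release suffixes (e.g. "rc1") are ignored by this sketch.
    parts = tuple(int(p) for p in installed.split(".")[:3] if p.isdigit())
    wanted = ".".join(map(str, minimum))
    if parts >= minimum:
        return f"{pkg} {installed} meets the minimum {wanted}"
    return f"{pkg} {installed} is below {wanted}; run: pip install -U {pkg}"

print(check_min_version("langchain-openai", (0, 1, 17)))
```

This avoids guessing from pip output: if the message says the version is below 0.1.17, upgrade and the ValueError should disappear, as reported above.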