Monitoring for multi-agent systems (OpenWebUI) #3737
-
Has anybody implemented a testing system with a real dataset in Langfuse to test OpenWebUI chatbots? I use Langfuse, LiteLLM, and OpenWebUI, and I configured LiteLLM to send callbacks to Langfuse to track all the calls. The problem is that the tracking data does not contain the ID of the user or of the conversation. Is there a configuration I can set so that this information gets sent to Langfuse?
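For context, here is a minimal sketch of what that LiteLLM-to-Langfuse callback wiring typically looks like with the LiteLLM Python SDK (if you configured the proxy via `config.yaml` instead, the equivalent is `litellm_settings: success_callback: ["langfuse"]`). The keys and model name are placeholders:

```python
import os
import litellm

# Placeholder Langfuse credentials -- the "langfuse" callback reads these from the environment
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or a self-hosted Langfuse URL

# Log every successful and failed completion to Langfuse
litellm.success_callback = ["langfuse"]
litellm.failure_callback = ["langfuse"]

# Without extra metadata, the resulting trace carries no user or conversation ID,
# which is exactly the problem described above
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```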
-
Hello @OM-EL! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together! To include user and conversation IDs in the tracking data sent to Langfuse using LiteLLM, you can set the `metadata` parameter on each `completion` call:

```python
from litellm import completion

# Set custom Langfuse trace parameters
response = completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
    metadata={
        "trace_user_id": "user-id",   # Set Langfuse Trace User ID
        "session_id": "session-id",   # Set Langfuse Session ID
        "tags": ["tag1", "tag2"]      # Optionally set Langfuse Tags
    },
)
print(response)
```

Make sure you have the necessary environment variables set for Langfuse as well [1][2].
-
Hi @OM-EL, the easiest way would be to integrate OpenWebUI directly with Langfuse using the Langfuse pipeline for OpenWebUI. To integrate, follow this guide on OpenWebUI pipelines. We are also releasing a cookbook on this in our docs in the next couple of days.
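For reference, the pipeline approach boils down to a filter that runs on every chat request and forwards OpenWebUI's user and chat IDs to Langfuse. Below is a heavily simplified sketch, assuming the OpenWebUI Pipelines filter convention (`type = "filter"` with an `inlet` hook receiving `body` and `user`) and the Langfuse v2 low-level Python SDK; field names such as `body["chat_id"]` and the credentials are assumptions, and the real filter also logs outputs and generations:

```python
from typing import Optional
from langfuse import Langfuse  # assumes the v2-style low-level SDK


class Pipeline:
    def __init__(self):
        self.type = "filter"  # filter pipelines wrap every chat completion request
        self.name = "Langfuse Filter (sketch)"
        self.langfuse = Langfuse(
            public_key="pk-lf-...",  # placeholder credentials
            secret_key="sk-lf-...",
            host="https://cloud.langfuse.com",
        )

    async def inlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # OpenWebUI passes the authenticated user and (assumed) the chat_id of the conversation,
        # which map directly onto Langfuse's user_id and session_id
        self.langfuse.trace(
            name="openwebui-chat",
            input=body,
            user_id=user["id"] if user else None,
            session_id=body.get("chat_id"),
        )
        return body
```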
-
@jannikmaierhoefer the problem is that I initialise multiple OpenAI clients inside the OpenWebUI function; the filter gets applied only to the main call to the LLM, and none of the sub-calls get tracked. The diagram shows this. For concreteness, a sketch of the pattern I mean is below (hypothetical names, placeholder model).
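```python
from openai import OpenAI


def pipe(user_message: str) -> str:
    """Sketch of an OpenWebUI function: hypothetical names, placeholder key and model."""
    # A client created inside the function -- these calls never pass through the
    # OpenWebUI request path, so the Langfuse filter never sees them
    client = OpenAI(api_key="sk-placeholder")

    # Sub-call 1: not tracked
    plan = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Plan the steps to answer: {user_message}"}],
    ).choices[0].message.content

    # Sub-call 2: also not tracked
    answer = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": plan}],
    ).choices[0].message.content

    # Only the outer request/response that OpenWebUI itself handles is visible to the filter
    return answer
```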
Interesting, thanks for sharing! Can you open an issue on the OpenWebUI repo to ask what the best solution for this is? I think sending the data from OpenWebUI directly would be preferable to the LiteLLM callback.
Otherwise, you could try to add user/session tracking as metadata to the OpenWebUI-LiteLLM integration, but that might be harder to achieve as it is not really an integration but just a drop-in replacement for OpenAI.
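As a sketch of that second option: rather than literal HTTP headers, one way to attach the IDs is the `metadata` field in the request body, which (to my knowledge) the LiteLLM proxy forwards to its Langfuse logger. The proxy URL, key, and IDs below are placeholders, and the same keys as in the direct `litellm.completion` example above are assumed to apply:

```python
from openai import OpenAI

# Plain OpenAI client used as a drop-in, but pointed at the LiteLLM proxy (placeholder URL/key)
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-litellm-placeholder")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    # extra_body fields are passed through to the proxy, which hands `metadata` to Langfuse
    extra_body={
        "metadata": {
            "trace_user_id": "user-id",  # Langfuse trace user ID
            "session_id": "chat-id",     # Langfuse session ID -> groups sub-calls per conversation
            "tags": ["openwebui-subcall"],
        }
    },
)
print(response.choices[0].message.content)
```

This at least groups the sub-calls by user and conversation in Langfuse, even if sending the data from OpenWebUI directly remains the cleaner route.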