[Bug/Feature]: vertex_ai/meta/llama-3.2-90b-vision-instruct-maas
sometimes returns invalid content when streaming
#6354
Labels
bug
Something isn't working
What happened?
Sometimes, the `content` field returned for `vertex_ai/meta/llama-3.2-90b-vision-instruct-maas` while streaming is invalid. When I make the same request directly against Vertex AI, I occasionally see this bug too, so the problem isn't in LiteLLM itself.
I would really like #5416 to be implemented, so we could selectively disable streaming for buggy endpoints ourselves.
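In the meantime, a user-side workaround is possible: keep a blocklist of models with broken streaming and force `stream=False` for them before calling LiteLLM. This is a hypothetical sketch (the blocklist, helper name, and set contents are my own, not part of LiteLLM); `litellm.completion` does accept a `stream` keyword argument.

```python
# Hypothetical workaround: force non-streaming for models known to
# emit invalid streamed `content`. The blocklist and helper are
# illustrative, not part of LiteLLM's API.

# Models whose streaming responses are sometimes invalid (assumption).
BUGGY_STREAMING_MODELS = {
    "vertex_ai/meta/llama-3.2-90b-vision-instruct-maas",
}

def should_stream(model: str, requested_stream: bool) -> bool:
    """Return the effective `stream` flag: disable streaming for
    models on the blocklist, otherwise honor the caller's request."""
    if model in BUGGY_STREAMING_MODELS:
        return False
    return requested_stream

# Usage sketch (assumes litellm is installed and credentials are set):
#
#   import litellm
#   model = "vertex_ai/meta/llama-3.2-90b-vision-instruct-maas"
#   response = litellm.completion(
#       model=model,
#       messages=[{"role": "user", "content": "Hello"}],
#       stream=should_stream(model, requested_stream=True),
#   )
```

This only hides the symptom (you lose incremental tokens for the affected model); #5416 would let this live in config instead of application code.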
Relevant log output
No response
Twitter / LinkedIn details
https://www.linkedin.com/in/davidmanouchehri/