create_partial streaming not behaving as expected #665
Comments
I was able to reproduce this but I don't really have time this week to take a look.
@jxnl not sure I'll have time before you, but if I can dive in, any quick tips on where to look?
```python
import pydantic_core

print(
    pydantic_core.from_json(
        '{"name": "jaso',
        allow_partial=True,
    )
)
# > {}
print(
    pydantic_core.from_json(
        '{"name": "jaso"',
        allow_partial=True,
    )
)
# > {'name': 'jaso'}
```
@jxnl I see, so does the partial stream have unclosed JSON up until the last token? Maybe we have to downgrade pydantic.
@jxnl looks like pydantic/pydantic-core#1293 was closed, can you check if it solved the issue with partial streaming?
https://github.com/pydantic/jiter/releases/tag/v0.4.0 fix released 18 hrs ago
Need to use
Ah! Can you make a PR for this?
Haven't had a chance yet, might be able to this week.
@ellipsis-dev can you try to make this change, and make sure the imports are correct?
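A rough sketch of what that change could look like, swapping pydantic_core's partial parsing for jiter's trailing-strings mode; the helper name `parse_partial_json` is hypothetical and not instructor's actual internal API:

```python
from jiter import from_json


def parse_partial_json(chunk: str) -> dict:
    """Hypothetical helper: parse an incomplete JSON string from the stream.

    jiter's "trailing-strings" partial mode (added in jiter 0.4.0) keeps an
    unterminated trailing string value instead of dropping it, which is what
    partial streaming needs in order to surface fields token by token.
    """
    return from_json(chunk.encode(), partial_mode="trailing-strings")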
@jxnl - I love the instructor package; simply indispensable.
@ivanleomk can you take a look at this?
I tested it out with jiter and it seems to work! Will try working on a PR for this.

```python
from jiter import from_json
from pydantic_core import from_json as from_json_pydantic

partial_json = '{"name": "jaso'

# jiter's trailing-strings mode keeps the unterminated string value.
parsed_data = from_json(partial_json.encode(), partial_mode="trailing-strings")
print("(jiter) : Parsed partial JSON:", parsed_data)

# pydantic_core's allow_partial drops the incomplete trailing string.
parsed_data = from_json_pydantic(partial_json.encode(), allow_partial=True)
print("(Pydantic) : Parsed partial JSON:", parsed_data)
```

This gives the output below.
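Expected output, per the pydantic_core example earlier in the thread and jiter's documented trailing-strings behavior:

```
(jiter) : Parsed partial JSON: {'name': 'jaso'}
(Pydantic) : Parsed partial JSON: {}
```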
Closing this since #745 addresses this issue |
Great, it now works perfectly with OpenAI and Anthropic |
What Model are you using?
Describe the bug
Partial streaming returns an extraction on every generation of the extraction stream; however, fields are `None` on every generation until the final token, at which point the extraction fills out all at once, essentially defeating the purpose of streaming.

To Reproduce
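A minimal sketch of the kind of call being described, assuming the OpenAI integration; the `User` schema, model name, and prompt are placeholders rather than the reporter's original code:

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


class User(BaseModel):
    name: str
    bio: str


client = instructor.from_openai(OpenAI())

# Stream partial extractions; each iteration should yield a User whose
# fields fill in as tokens arrive. The reported behavior is that every
# field stays None until the final chunk.
extraction_stream = client.chat.completions.create_partial(
    model="gpt-4-turbo",  # placeholder model name
    response_model=User,
    messages=[
        {"role": "user", "content": "Write a one-line bio for a user named Jason."}
    ],
)

for partial_user in extraction_stream:
    print(partial_user)
```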
Expected behavior
This used to add new content on every token output, but now fields are always `None` until the last token for that field completes.

Screenshots
