
Fix chat template not applied in TransformersLLM #1083

Merged
merged 6 commits into main from fix-transformers-prepare-input
Dec 18, 2024

Conversation

gabrielmbmb
Member

No description provided.
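The PR description is empty, but the title indicates the fix: `TransformersLLM` was not applying the tokenizer's chat template when preparing model inputs. As a hedged illustration only (not the actual distilabel code, which delegates to Hugging Face transformers' `tokenizer.apply_chat_template`), this pure-Python stand-in sketches what "applying a chat template" means, using a ChatML-style layout; all names and the template format here are illustrative assumptions:

```python
# Illustrative sketch: render chat messages into a single prompt string the
# way a chat template would, rather than naively concatenating the contents.
# The real implementation uses the Jinja template stored on the tokenizer.

def apply_chat_template(messages, add_generation_prompt=True):
    """Render a list of {"role": ..., "content": ...} dicts into one
    ChatML-style prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    if add_generation_prompt:
        # Leave an open assistant turn for the model to complete.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
prompt = apply_chat_template(messages)
```

Skipping this step and feeding raw message contents to a chat-tuned model typically degrades output quality, which is the kind of silent bug the PR title describes.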

codspeed-hq bot commented Dec 18, 2024

CodSpeed Performance Report

Merging #1083 will not alter performance

Comparing fix-transformers-prepare-input (36fb528) with main (844165f)

🎉 Hooray! pytest-codspeed just leveled up to 3.1.0!

A heads-up, this is a breaking change and it might affect your current performance baseline a bit. But here's the exciting part - it's packed with new, cool features and promises improved result stability 🥳!
Curious about what's new? Visit our releases page to delve into all the awesome details about this new version.

Summary

✅ 1 untouched benchmark

Member

@davidberenstein1957 davidberenstein1957 left a comment

Some tests are failing, but it seems good besides that. You plan on creating a patch, right?


Documentation for this PR has been built. You can view it at: https://distilabel.argilla.io/pr-1083/

@gabrielmbmb gabrielmbmb merged commit bfc8445 into main Dec 18, 2024
8 checks passed
@gabrielmbmb gabrielmbmb deleted the fix-transformers-prepare-input branch December 18, 2024 16:25
2 participants