
[Misc] Bump up transformers to v4.39.0 & Remove StarCoder2Config #3551

Merged
merged 2 commits into main from upgrade-hf
Mar 21, 2024

Conversation

WoosukKwon
Collaborator

This PR bumps the transformers version up to v4.39.0. Thanks to the version bump, we can delete StarCoder2Config.
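As an illustration (not the actual diff in this PR), with transformers >= 4.39.0 the StarCoder2 configuration is available upstream, so a vendored config class is no longer needed. A minimal sketch, assuming the upstream class name `Starcoder2Config`:

```python
# Minimal sketch: with transformers >= 4.39.0, the StarCoder2 config ships upstream,
# so a locally vendored StarCoder2Config class is no longer required.
from transformers import Starcoder2Config

config = Starcoder2Config()   # default StarCoder2 hyperparameters
print(config.model_type)      # "starcoder2"
```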

@WoosukKwon WoosukKwon requested a review from ywang96 March 21, 2024 07:23
@ywang96
Member

LGTM! I went through the release log and didn't find any breaking changes related to our transformers usage from upgrading it to 4.39.

@WoosukKwon
Collaborator Author

@ywang96 Thanks for the quick review!

@esmeetu
Collaborator

esmeetu commented Mar 21, 2024

@WoosukKwon I have resolved the conflicts related to the Jais model. Also, this PR doesn't seem to add @bufferoverflow as a co-author.

@esmeetu esmeetu disabled auto-merge March 21, 2024 13:46
@WoosukKwon WoosukKwon merged commit c188ecb into main Mar 21, 2024
32 checks passed
@WoosukKwon
Collaborator Author

@esmeetu Thanks for fixing the merge error! I actually added him as a co-author in the commit message 😅. That's why I disabled and re-enabled auto-merge.

@WoosukKwon WoosukKwon deleted the upgrade-hf branch March 21, 2024 14:58
@esmeetu
Collaborator

esmeetu commented Mar 21, 2024

But I didn't find that. So weird. 🫨

@WoosukKwon
Collaborator Author

Oh, it was in the commit message for the scheduled squash merge, so it wasn't visible.
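For reference, GitHub picks up co-authors from `Co-authored-by:` trailers at the end of the squash-merge commit message. A minimal illustration using this PR's title and a placeholder contributor (the name and email below are hypothetical):

```text
[Misc] Bump up transformers to v4.39.0 & Remove StarCoder2Config (#3551)

Co-authored-by: Example Contributor <contributor@example.com>
```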

tjohnson31415 added a commit to tjohnson31415/vllm that referenced this pull request Mar 21, 2024
* upstream/main:
  [Misc] Bump up transformers to v4.39.0 & Remove StarCoder2Config (vllm-project#3551)
  [Misc][Log] Add log for tokenizer length not equal to vocabulary size (vllm-project#3500)
  [🚀 Ready to be merged] Added support for Jais models (vllm-project#3183)
  Fix 1D query issue from `_prune_hidden_states` (vllm-project#3539)
  [PREFIX CACHING FOLLOW UP] OrderedDict-based evictor (vllm-project#3431)
  [BugFix] Hot fix in setup.py for neuron build (vllm-project#3537)
  Migrate `logits` computation and gather to `model_runner` (vllm-project#3233)
  [1/n][Chunked Prefill] Refactor input query shapes (vllm-project#3236)
  [1/n] Triton sampling kernel (vllm-project#3186)
  [Bugfix] Fix ROCm support in CMakeLists.txt (vllm-project#3534)
@bufferoverflow
Contributor

There is a patch release out, see https://github.com/huggingface/transformers/releases/tag/v4.39.1. Maybe bump to this already?

@ywang96
Member

ywang96 commented Mar 24, 2024

There is a patch release out, see https://github.com/huggingface/transformers/releases/tag/v4.39.1. Maybe bump to this already?

@bufferoverflow Thank you for bringing this up! Yes, we were aware of this patch. (In fact, 4.39.0 was breaking #3042, so we had to raise an issue on transformers because of it: huggingface/transformers#29789.)

@WoosukKwon What's your opinion on explicitly requiring transformers >= 4.39.1?

@WoosukKwon
Collaborator Author

We should do it. Could you submit a PR? Otherwise I will do it tomorrow.
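A minimal sketch of the kind of version floor being discussed (the exact requirements file name and layout in vLLM are assumptions here):

```text
# requirements.txt (illustrative)
transformers >= 4.39.1  # 4.39.1 contains the fix for the regression that affected #3042
```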

Temirulan pushed a commit to Temirulan/vllm-whisper that referenced this pull request Sep 6, 2024
[Misc] Bump up transformers to v4.39.0 & Remove StarCoder2Config (vllm-project#3551)

Co-authored-by: Roy <jasonailu87@gmail.com>
Co-authored-by: Roger Meier <r.meier@siemens.com>