
Compatibility with latest transformers library #19

Open
Sharrnah opened this issue Aug 14, 2024 · 3 comments

Comments

@Sharrnah

Currently, this is not compatible with the latest Hugging Face transformers library, since it calls some internal methods that have either changed or were removed.

Tested with transformers v4.44.0.

In the model.py file,

self.whisper_model._set_return_outputs(
            return_dict_in_generate=return_dict_in_generate,
            return_token_timestamps=return_token_timestamps,
            is_shortform=is_shortform,
            logprob_threshold=logprob_threshold,
            generation_config=generation_config,
        )

gets called, but the is_shortform argument was removed.

self.whisper_model._set_token_ids(
            generation_config=generation_config, config=self.config, kwargs=kwargs
        )

gets called, but the method _set_token_ids doesn't exist anymore.

There might be other incompatibilities.

This should be updated to be compatible with the latest transformers version.
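
As a rough sketch (not a tested fix), the calls could be guarded against the installed transformers API with inspect/hasattr before forwarding the arguments. The helper names below are hypothetical, and the sketch assumes _set_return_outputs still exists in newer versions with only the is_shortform argument dropped:

import inspect

def call_set_return_outputs(whisper_model, generation_config, **kwargs):
    # Only forward is_shortform if the installed transformers version
    # still accepts it (reported removed in newer versions, tested with 4.44.0).
    params = inspect.signature(whisper_model._set_return_outputs).parameters
    if "is_shortform" not in params:
        kwargs.pop("is_shortform", None)
    whisper_model._set_return_outputs(
        generation_config=generation_config, **kwargs
    )

def call_set_token_ids(whisper_model, generation_config, config, kwargs):
    # _set_token_ids no longer exists in newer releases, so only call it
    # when it is still present; newer versions would need a reimplementation.
    if hasattr(whisper_model, "_set_token_ids"):
        whisper_model._set_token_ids(
            generation_config=generation_config, config=config, kwargs=kwargs
        )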

@dinhduongnguyen

Yeah, I got the same issue.

@AvivSham
Collaborator

Hi @Sharrnah and @dinhduongnguyen,
Thank you for your interest in our work!
To use whisper-medusa you should follow the installation steps in the README file. The supported version of transformers is specified in the requirements.txt file. Specifically, the supported version is transformers==4.39.0.

@Sharrnah
Author

> Hi @Sharrnah and @dinhduongnguyen, Thank you for your interest in our work! To use whisper-medusa you should follow the installation steps in the README file. The supported version of transformers is specified in the requirements.txt file. Specifically, the supported version is transformers==4.39.0.

I am aware of that. But it's always good to support the newest version if possible. There are different reasons why someone might want to use this together with a newer transformers version.

For example, someone might want to use it together with another library that has been updated to require the newest transformers release. Or someone might want to use an AI model that is only supported in a more recent transformers version.

As it is now, I would rather remove whisper-medusa support altogether than downgrade transformers.
