
_tf_available for customized built tensorflow #18642

Closed
1 of 4 tasks
kevint324 opened this issue Aug 16, 2022 · 4 comments · Fixed by #18650
Labels
Feature request Request for a new feature

Comments

kevint324 (Contributor) commented Aug 16, 2022

System Info

n/a

Who can help?

@Rocketknight1

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

  File "virtualenv_mlu/lib/python3.8/site-packages/transformers/pipelines/base.py", line 212, in infer_framework_load_model
    raise RuntimeError(
RuntimeError: At least one of TensorFlow 2.0 or PyTorch should be installed. To install TensorFlow 2.0, read the instructions at https://www.tensorflow.org/install/ To install PyTorch, read the instructions at https://pytorch.or

Expected behavior

https://github.com/huggingface/transformers/blob/02b176c4ce14340d26d42825523f406959c6c202/src/transformers/utils/import_utils.py#L63-L75

I built a tensorflow-xxu package for our in-house accelerator and tried to run the transformers example.
I got the RuntimeError above indicating that TF is not available.
Currently _tf_available is determined from a hard-coded list of candidate package names.
I'm not sure if extending the existing list is a good idea.
Maybe adding some runtime flexibility would be better?

Thanks
Kevin
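
The pattern Kevin describes can be sketched as follows. This is an illustrative reconstruction, not the actual transformers code: the candidate list, the function name is_tf_available, and the FORCE_TF_AVAILABLE override (the approach taken in the linked PR) are shown here only to make the problem and the suggested fix concrete.

```python
import importlib.metadata
import os

# Hypothetical hard-coded candidate list, as in the code Kevin links to.
# A custom build published under another name (e.g. tensorflow-xxu) is
# never probed, so TF is reported as unavailable.
_TF_CANDIDATES = ("tensorflow", "tensorflow-cpu", "tensorflow-gpu", "tf-nightly")

def is_tf_available() -> bool:
    # Runtime escape hatch: an environment override lets custom builds
    # bypass the candidate probing entirely.
    if os.environ.get("FORCE_TF_AVAILABLE", "").upper() in {"1", "ON", "YES", "TRUE"}:
        return True
    for pkg in _TF_CANDIDATES:
        try:
            importlib.metadata.version(pkg)  # raises if pkg is not installed
            return True
        except importlib.metadata.PackageNotFoundError:
            continue
    return False
```

With the override set, the check succeeds regardless of which TensorFlow package name is installed.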

@kevint324 kevint324 added the bug label Aug 16, 2022
ydshieh (Collaborator) commented Aug 16, 2022

Hi @kevint324 I think it (extending the list) is fine if you work with a specific transformers version, but it would be a bit tedious if you want to keep up with newer versions.

cc @Rocketknight1 for the idea regarding adding some runtime flexibility.

@ydshieh ydshieh added the Feature request label and removed the bug label Aug 16, 2022
ydshieh (Collaborator) commented Aug 16, 2022

Changed the tag to Feature request instead :-)

Rocketknight1 (Member) commented

Hi @kevint324, I filed a PR that might resolve this, but I want to check with other maintainers that it's okay before I merge it. In the meantime, can you try it out? Just run pip install git+https://github.com/huggingface/transformers.git@allow_force_tf_availability, then set the environment variable FORCE_TF_AVAILABLE=1 before running your code, and it should skip those checks now.
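
The workaround above, written out as shell commands. The branch name and the FORCE_TF_AVAILABLE variable come from the comment; the install line is shown as a comment (it pulls from the network), and the final command stands in for running your own script with the override set.

```shell
# Install the PR branch (from the comment above; not executed here):
#   pip install git+https://github.com/huggingface/transformers.git@allow_force_tf_availability

# Run with the override set; a stand-in command shows the variable is visible
# to the child Python process exactly as a real script would see it.
FORCE_TF_AVAILABLE=1 python3 -c "import os; print(os.environ['FORCE_TF_AVAILABLE'])"
```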

kevint324 (Contributor, Author) commented

Yes, it works.
Thanks for the quick fix.
