I am trying to install an old version of transformers with `pip install transformers==3.3.1`. The bundled tokenizers package needs a Rust compiler, so I followed the steps on this page: https://huggingface.co/docs/tokenizers/installation. Re-running the installation command, I found that cargo could not fetch the `libc` dependency:
ransformers==3.3.1) (8.1.3)
Requirement already satisfied: joblib in /scratch/yerong/.conda/envs/reason/lib/python3.11/site-packages (from sacremoses->transformers==3.3.1) (1.2.0)
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [55 lines of output]
/tmp/pip-build-env-bfzcki16/overlay/lib/python3.11/site-packages/setuptools/dist.py:519: InformationOnly: Normalizing '0.8.1.rc2' to '0.8.1rc2'
self.metadata.version = self._normalize_version(
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-311
creating build/lib.linux-x86_64-cpython-311/tokenizers
copying tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/models
copying tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/models
creating build/lib.linux-x86_64-cpython-311/tokenizers/decoders
copying tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
creating build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
copying tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
copying tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-cpython-311/tokenizers/processors
copying tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
creating build/lib.linux-x86_64-cpython-311/tokenizers/trainers
copying tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
creating build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-311/tokenizers/implementations
copying tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers
copying tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/models
copying tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/decoders
copying tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/normalizers
copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/pre_tokenizers
copying tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/processors
copying tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-311/tokenizers/trainers
running build_ext
running build_rust
cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib
warning: unused manifest key: target.x86_64-apple-darwin.rustflags
Updating crates.io index
warning: spurious network error (2 tries remaining): bad packet length; class=Net (12)
error: failed to get `libc` as a dependency of package `tokenizers-python v0.8.1-rc2 (/tmp/pip-install-8814ldyi/tokenizers_cbab6a736f314261bfaeb5b456c7afe8)`
Caused by:
failed to load source for dependency `libc`
Caused by:
Unable to update registry `https://github.com/rust-lang/crates.io-index`
Caused by:
failed to fetch `https://github.com/rust-lang/crates.io-index`
Caused by:
error reading from the zlib stream; class=Zlib (5)
error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module -- --crate-type cdylib` failed with code 101
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
Steps
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
pip install transformers==3.3.1
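For reference, sourcing cargo's env file in step 2 essentially just prepends cargo's bin directory to PATH, which is what lets pip's build subprocess find rustc and cargo:

```shell
# Roughly what `source "$HOME/.cargo/env"` does: put the
# rustup-installed toolchain on PATH for this shell session.
export PATH="$HOME/.cargo/bin:$PATH"
```

If the tokenizers build still reports that cargo or rustc is missing, check that this PATH change is visible in the same shell that runs pip.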
Possible Solution(s)
No response
Notes
No response
Version
No response
Does it run with `CARGO_HTTP_MULTIPLEXING=false`? Some proxies have problems with HTTP/2.
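In shell terms, that suggestion looks like the following; the variable must be exported in the same shell that then runs pip, so that cargo (spawned by the tokenizers build) inherits it:

```shell
# Disable cargo's HTTP/2 multiplexing for registry downloads.
# Some proxies corrupt multiplexed streams, which can surface as
# "bad packet length" or zlib errors like the ones in the log above.
export CARGO_HTTP_MULTIPLEXING=false
```

After setting it, re-run `pip install transformers==3.3.1` in the same shell.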
If either of those last two works for you, it can also serve as a workaround.
Another potential workaround, if you have a new enough cargo, is to enable sparse registry support, which changes how cargo does network communication and might bypass the root cause of this problem.
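A sketch of that workaround, assuming cargo 1.68 or newer (older nightlies used the unstable `-Z sparse-registry` flag instead): the sparse protocol fetches the crates.io index over plain HTTPS rather than cloning the git index, so it sidesteps the git/zlib fetch path that is failing above.

```shell
# Tell cargo to use the sparse (HTTP-based) index protocol for
# crates.io instead of the git-clone protocol. Stable in cargo 1.68+.
export CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse
```

The same setting can be made persistent in `~/.cargo/config.toml` under `[registries.crates-io]` with `protocol = "sparse"`.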
Thanks.
CARGO_HTTP_DEBUG=true CARGO_LOG=cargo::ops::registry=trace pip install transformers==3.3.1
works for Python 3.7 (conda) but not Python 3.11 (same conda setup).
I'm not sure what the difference between the two is, though.
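One plausible explanation (an assumption, not confirmed here): tokenizers 0.8.1rc2 predates CPython 3.11, so PyPI likely carries a prebuilt wheel matching the cp37 tag but nothing for cp311, forcing a from-source Rust build (and hence cargo network access) only on 3.11. You can print the interpreter tag pip matches wheels against:

```shell
# Print the CPython wheel tag for the current interpreter,
# e.g. "cp37" on Python 3.7 and "cp311" on Python 3.11.
python3 -c 'import sys; print("cp%d%d" % sys.version_info[:2])'
```

If the tag printed in each conda env differs, that would explain why one env installs a wheel while the other compiles from the sdist.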