pip install error on Apple M1 Pro #27

Closed
CaffreyR opened this issue Jul 12, 2022 · 0 comments · Fixed by #31

Comments

@CaffreyR

Hi, when I tried to pip install this package, I ran into the error below. @codeKgu

  Created wheel for multimodal_transformers: filename=multimodal_transformers-0.1.4a0-py3-none-any.whl size=20633 sha256=38e7d2cec34c74fd0449e91c591ce8427e66e540bfba2969607951e07bb5ac16
  Stored in directory: /Users/caffrey/Library/Caches/pip/wheels/10/ca/98/03f280d4b1c8bb8ead2f4eb5df5b74bd9736ca209972990112
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [51 lines of output]
      /private/var/folders/x0/z03_gg6j6_ggrj9yck7c6f6c0000gn/T/pip-build-env-9mte3dpz/overlay/lib/python3.9/site-packages/setuptools/dist.py:530: UserWarning: Normalizing '0.8.1.rc2' to '0.8.1rc2'
        warnings.warn(tmpl.format(**locals()))
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.macosx-11.0-arm64-cpython-39
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers
      copying tokenizers/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/models
      copying tokenizers/models/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/models
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/decoders
      copying tokenizers/decoders/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/decoders
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/normalizers
      copying tokenizers/normalizers/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/normalizers
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/pre_tokenizers
      copying tokenizers/pre_tokenizers/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/pre_tokenizers
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/processors
      copying tokenizers/processors/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/processors
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/trainers
      copying tokenizers/trainers/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/trainers
      creating build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/byte_level_bpe.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/base_tokenizer.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/__init__.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/char_level_bpe.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/implementations/bert_wordpiece.py -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/implementations
      copying tokenizers/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers
      copying tokenizers/models/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/models
      copying tokenizers/decoders/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/decoders
      copying tokenizers/normalizers/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/normalizers
      copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/pre_tokenizers
      copying tokenizers/processors/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/processors
      copying tokenizers/trainers/__init__.pyi -> build/lib.macosx-11.0-arm64-cpython-39/tokenizers/trainers
      running build_ext
      running build_rust
      info: syncing channel updates for 'nightly-2020-05-14-aarch64-apple-darwin'
      info: latest update on 2020-05-14, rust version 1.45.0-nightly (75e1463c5 2020-05-13)
      error: target 'aarch64-apple-darwin' not found in channel.  Perhaps check https://doc.rust-lang.org/nightly/rustc/platform-support.html for available targets
      error: can't find Rust compiler
      
      If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
      
      To update pip, run:
      
          pip install --upgrade pip
      
      and then retry package installation.
      
      If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
  Building wheel for sacremoses (setup.py) ... done
  Created wheel for sacremoses: filename=sacremoses-0.0.53-py3-none-any.whl size=895260 sha256=e0c1c85b58c441c00e8acb472af863785045a148042632a128949c7ed94e6ba2
  Stored in directory: /Users/caffrey/Library/Caches/pip/wheels/12/1c/3d/46cf06718d63a32ff798a89594b61e7f345ab6b36d909ce033
Successfully built multimodal_transformers sacremoses
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
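
For anyone hitting this on Apple Silicon: the root cause is visible in the log above. tokenizers 0.8.1rc2 ships no prebuilt arm64 wheel, so pip falls back to a source build, and that build pins the Rust toolchain nightly-2020-05-14, which predates aarch64-apple-darwin support. A minimal sketch of the two workarounds the error output suggests (the idea that any recent tokenizers release ships arm64-compatible wheels is an assumption, not something confirmed in this thread):

```sh
# Workaround 1: upgrade pip so it can select newer prebuilt wheels,
# then pull in a tokenizers release that has arm64/universal2 wheels.
# (Assumption: a reasonably recent tokenizers; the exact version floor
# is untested here.)
pip install --upgrade pip
pip install --upgrade tokenizers

# Workaround 2: install a current Rust toolchain via rustup (the official
# installer command). Note: stable Rust has supported aarch64-apple-darwin
# since 1.49, but this only helps if the build is not pinned to the old
# 2020-05-14 nightly, which has no arm64 macOS target at all.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
pip install multimodal_transformers
```

Since the pinned nightly simply does not exist for aarch64-apple-darwin, the prebuilt-wheel route is the more reliable of the two; per the issue metadata, #31 resolved this on the package side.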